[DEPRECATION WARNING]: ANSIBLE_COLLECTIONS_PATHS option, does not fit var naming standard, use the singular form ANSIBLE_COLLECTIONS_PATH instead. This feature will be removed from ansible-core in version 2.19. Deprecation warnings can be disabled by setting deprecation_warnings=False in ansible.cfg.
25201 1726882678.29020: starting run
ansible-playbook [core 2.17.4]
  config file = None
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/local/lib/python3.12/site-packages/ansible
  ansible collection location = /tmp/collections-Xyq
  executable location = /usr/local/bin/ansible-playbook
  python version = 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] (/usr/bin/python3.12)
  jinja version = 3.1.4
  libyaml = True
No config file found; using defaults
25201 1726882678.29462: Added group all to inventory
25201 1726882678.29466: Added group ungrouped to inventory
25201 1726882678.29470: Group all now contains ungrouped
25201 1726882678.29474: Examining possible inventory source: /tmp/network-91m/inventory.yml
25201 1726882678.51125: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/cache
25201 1726882678.51184: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py
25201 1726882678.51207: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory
25201 1726882678.51271: Loading InventoryModule 'host_list' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py
25201 1726882678.51344: Loaded config def from plugin (inventory/script)
25201 1726882678.51346: Loading InventoryModule 'script' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py
25201 1726882678.51388: Loading InventoryModule 'auto' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py
25201 1726882678.51474: Loaded config def from plugin (inventory/yaml)
25201 1726882678.51477: Loading InventoryModule 'yaml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py
25201 1726882678.51561: Loading InventoryModule 'ini' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/ini.py
25201 1726882678.52682: Loading InventoryModule 'toml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/toml.py
25201 1726882678.52686: Attempting to use plugin host_list (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py)
25201 1726882678.52689: Attempting to use plugin script (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py)
25201 1726882678.52696: Attempting to use plugin auto (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py)
25201 1726882678.52700: Loading data from /tmp/network-91m/inventory.yml
25201 1726882678.52770: /tmp/network-91m/inventory.yml was not parsable by auto
25201 1726882678.52837: Attempting to use plugin yaml (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py)
25201 1726882678.52878: Loading data from /tmp/network-91m/inventory.yml
25201 1726882678.52960: group all already in inventory
25201 1726882678.52968: set inventory_file for managed_node1
25201 1726882678.52973: set inventory_dir for managed_node1
25201 1726882678.52974: Added host managed_node1 to inventory
25201 1726882678.52976: Added host managed_node1 to group all
25201 1726882678.52977: set ansible_host for managed_node1
25201 1726882678.52978:
set ansible_ssh_extra_args for managed_node1 25201 1726882678.52982: set inventory_file for managed_node2 25201 1726882678.52984: set inventory_dir for managed_node2 25201 1726882678.52985: Added host managed_node2 to inventory 25201 1726882678.52987: Added host managed_node2 to group all 25201 1726882678.52988: set ansible_host for managed_node2 25201 1726882678.52988: set ansible_ssh_extra_args for managed_node2 25201 1726882678.52991: set inventory_file for managed_node3 25201 1726882678.52994: set inventory_dir for managed_node3 25201 1726882678.52994: Added host managed_node3 to inventory 25201 1726882678.52996: Added host managed_node3 to group all 25201 1726882678.52997: set ansible_host for managed_node3 25201 1726882678.52997: set ansible_ssh_extra_args for managed_node3 25201 1726882678.53000: Reconcile groups and hosts in inventory. 25201 1726882678.53004: Group ungrouped now contains managed_node1 25201 1726882678.53006: Group ungrouped now contains managed_node2 25201 1726882678.53008: Group ungrouped now contains managed_node3 25201 1726882678.53091: '/usr/local/lib/python3.12/site-packages/ansible/plugins/vars/__init__' skipped due to reserved name 25201 1726882678.53918: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments 25201 1726882678.53970: Loading ModuleDocFragment 'vars_plugin_staging' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/vars_plugin_staging.py 25201 1726882678.53998: Loaded config def from plugin (vars/host_group_vars) 25201 1726882678.54000: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=False, class_only=True) 25201 1726882678.54007: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/vars 25201 1726882678.54015: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 25201 1726882678.54058: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py (found_in_cache=True, class_only=False) 25201 1726882678.54384: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882678.54489: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py 25201 1726882678.54529: Loaded config def from plugin (connection/local) 25201 1726882678.54532: Loading Connection 'local' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/local.py (found_in_cache=False, class_only=True) 25201 1726882678.55131: Loaded config def from plugin (connection/paramiko_ssh) 25201 1726882678.55134: Loading Connection 'paramiko_ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/paramiko_ssh.py (found_in_cache=False, class_only=True) 25201 1726882678.56306: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 25201 1726882678.56347: Loaded config def from plugin (connection/psrp) 25201 1726882678.56351: Loading Connection 'psrp' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/psrp.py (found_in_cache=False, class_only=True) 25201 1726882678.57073: Loading ModuleDocFragment 'connection_pipelining' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 25201 1726882678.57111: Loaded config def from plugin (connection/ssh) 25201 1726882678.57114: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=False, class_only=True) 25201 1726882678.59777: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 25201 1726882678.59817: Loaded config def from plugin (connection/winrm) 25201 1726882678.59820: Loading Connection 'winrm' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/winrm.py (found_in_cache=False, class_only=True) 25201 1726882678.59851: '/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/__init__' skipped due to reserved name 25201 1726882678.59916: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py 25201 1726882678.59987: Loaded config def from plugin (shell/cmd) 25201 1726882678.59990: Loading ShellModule 'cmd' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/cmd.py (found_in_cache=False, class_only=True) 25201 1726882678.60015: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py (found_in_cache=True, class_only=False) 25201 1726882678.60083: Loaded config def from plugin (shell/powershell) 25201 1726882678.60086: Loading ShellModule 'powershell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/powershell.py (found_in_cache=False, class_only=True) 25201 1726882678.60139: Loading ModuleDocFragment 'shell_common' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_common.py 25201 1726882678.60322: Loaded config def from plugin (shell/sh) 25201 1726882678.60324: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=False, class_only=True) 25201 1726882678.60358: '/usr/local/lib/python3.12/site-packages/ansible/plugins/become/__init__' skipped due to reserved name 25201 1726882678.60626: Loaded config def from plugin (become/runas) 25201 1726882678.60629: Loading BecomeModule 'runas' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/runas.py (found_in_cache=False, class_only=True) 25201 1726882678.60827: Loaded config def from plugin (become/su) 25201 1726882678.60829: Loading BecomeModule 'su' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/su.py (found_in_cache=False, class_only=True) 25201 1726882678.60991: Loaded config def from plugin (become/sudo) 25201 1726882678.60993: Loading BecomeModule 'sudo' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/sudo.py (found_in_cache=False, class_only=True) running playbook inside collection fedora.linux_system_roles 25201 1726882678.61027: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tests_ipv6_nm.yml 25201 1726882678.61431: in VariableManager get_vars() 25201 1726882678.61454: done with get_vars() 25201 1726882678.61602: trying /usr/local/lib/python3.12/site-packages/ansible/modules 25201 1726882678.64802: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action 25201 1726882678.64941: in VariableManager get_vars() 25201 
1726882678.64945: done with get_vars() 25201 1726882678.64948: variable 'playbook_dir' from source: magic vars 25201 1726882678.64949: variable 'ansible_playbook_python' from source: magic vars 25201 1726882678.64950: variable 'ansible_config_file' from source: magic vars 25201 1726882678.64951: variable 'groups' from source: magic vars 25201 1726882678.64951: variable 'omit' from source: magic vars 25201 1726882678.64952: variable 'ansible_version' from source: magic vars 25201 1726882678.64953: variable 'ansible_check_mode' from source: magic vars 25201 1726882678.64954: variable 'ansible_diff_mode' from source: magic vars 25201 1726882678.64954: variable 'ansible_forks' from source: magic vars 25201 1726882678.64955: variable 'ansible_inventory_sources' from source: magic vars 25201 1726882678.64956: variable 'ansible_skip_tags' from source: magic vars 25201 1726882678.64957: variable 'ansible_limit' from source: magic vars 25201 1726882678.64957: variable 'ansible_run_tags' from source: magic vars 25201 1726882678.64958: variable 'ansible_verbosity' from source: magic vars 25201 1726882678.65331: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml 25201 1726882678.66357: in VariableManager get_vars() 25201 1726882678.66380: done with get_vars() 25201 1726882678.66419: in VariableManager get_vars() 25201 1726882678.66432: done with get_vars() 25201 1726882678.66931: in VariableManager get_vars() 25201 1726882678.66946: done with get_vars() 25201 1726882678.66950: variable 'omit' from source: magic vars 25201 1726882678.67128: variable 'omit' from source: magic vars 25201 1726882678.67159: in VariableManager get_vars() 25201 1726882678.67235: done with get_vars() 25201 1726882678.67305: in VariableManager get_vars() 25201 1726882678.67318: done with get_vars() 25201 1726882678.67353: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 25201 1726882678.67600: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 25201 1726882678.67760: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 25201 1726882678.68779: in VariableManager get_vars() 25201 1726882678.68798: done with get_vars() 25201 1726882678.69223: trying /usr/local/lib/python3.12/site-packages/ansible/modules/__pycache__ 25201 1726882678.69360: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__ redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 25201 1726882678.71386: in VariableManager get_vars() 25201 1726882678.71404: done with get_vars() 25201 1726882678.71442: in VariableManager get_vars() 25201 1726882678.71482: done with get_vars() 25201 1726882678.72604: in VariableManager get_vars() 25201 1726882678.72622: done with get_vars() 25201 1726882678.72627: variable 'omit' from source: magic vars 25201 1726882678.72638: variable 'omit' from source: magic vars 25201 1726882678.72677: in VariableManager get_vars() 25201 1726882678.72692: done with get_vars() 25201 1726882678.72712: in VariableManager get_vars() 25201 1726882678.72729: done with get_vars() 25201 1726882678.72760: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 25201 1726882678.73092: Loading data from 
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 25201 1726882678.73187: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 25201 1726882678.73595: in VariableManager get_vars() 25201 1726882678.73617: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 25201 1726882678.75682: in VariableManager get_vars() 25201 1726882678.75702: done with get_vars() 25201 1726882678.75814: in VariableManager get_vars() 25201 1726882678.75915: done with get_vars() 25201 1726882678.75970: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback 25201 1726882678.75985: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__ redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug 25201 1726882678.76217: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py 25201 1726882678.80511: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.debug) 25201 1726882678.80515: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.debug' from /tmp/collections-Xyq/ansible_collections/ansible/posix/plugins/callback/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) 25201 1726882678.80547: '/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__init__' skipped due to reserved name 25201 1726882678.80582: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py (found_in_cache=True, class_only=False) 25201 1726882678.80758: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py 25201 1726882678.80941: Loaded config def from plugin (callback/default) 25201 1726882678.80944: Loading CallbackModule 'default' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/default.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) 25201 1726882678.82250: Loaded config def from plugin (callback/junit) 25201 1726882678.82253: Loading CallbackModule 'junit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/junit.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) 25201 1726882678.82307: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py (found_in_cache=True, class_only=False) 25201 1726882678.82375: Loaded config def from plugin (callback/minimal) 25201 1726882678.82378: Loading CallbackModule 'minimal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/minimal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) 25201 
1726882678.82422: Loading CallbackModule 'oneline' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/oneline.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
25201 1726882678.82483: Loaded config def from plugin (callback/tree)
25201 1726882678.82486: Loading CallbackModule 'tree' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/tree.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
redirecting (type: callback) ansible.builtin.profile_tasks to ansible.posix.profile_tasks
25201 1726882678.82617: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.profile_tasks)
25201 1726882678.82620: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.profile_tasks' from /tmp/collections-Xyq/ansible_collections/ansible/posix/plugins/callback/profile_tasks.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
Skipping callback 'default', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.

PLAYBOOK: tests_ipv6_nm.yml ****************************************************
2 plays in /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tests_ipv6_nm.yml
25201 1726882678.82680: in VariableManager get_vars()
25201 1726882678.82694: done with get_vars()
25201 1726882678.82700: in VariableManager get_vars()
25201 1726882678.82709: done with get_vars()
25201 1726882678.82712: variable 'omit' from source: magic vars
25201 1726882678.83024: in VariableManager get_vars()
25201 1726882678.83039: done with get_vars()
25201 1726882678.83060: variable 'omit' from source: magic vars

PLAY [Run playbook 'playbooks/tests_ipv6.yml' with nm as provider] *************
25201 1726882678.83681: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy
25201 1726882678.83758: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py
25201 1726882678.83789: getting the remaining hosts for this loop
25201 1726882678.83791: done getting the remaining hosts for this loop
25201 1726882678.83793: getting the next task for host managed_node2
25201 1726882678.83797: done getting next task for host managed_node2
25201 1726882678.83799: ^ task is: TASK: Gathering Facts
25201 1726882678.83801: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
25201 1726882678.83803: getting variables
25201 1726882678.83804: in VariableManager get_vars()
25201 1726882678.83813: Calling all_inventory to load vars for managed_node2
25201 1726882678.83815: Calling groups_inventory to load vars for managed_node2
25201 1726882678.83818: Calling all_plugins_inventory to load vars for managed_node2
25201 1726882678.83830: Calling all_plugins_play to load vars for managed_node2
25201 1726882678.83843: Calling groups_plugins_inventory to load vars for managed_node2
25201 1726882678.83847: Calling groups_plugins_play to load vars for managed_node2
25201 1726882678.83884: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
25201 1726882678.83938: done with get_vars()
25201 1726882678.83944: done getting variables
25201 1726882678.84008: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True)

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tests_ipv6_nm.yml:6
Friday 20 September 2024 21:37:58 -0400 (0:00:00.014) 0:00:00.014 ******
25201 1726882678.84029: entering _queue_task() for managed_node2/gather_facts
25201 1726882678.84030: Creating lock for gather_facts
25201 1726882678.84409: worker is 1 (out of 1 available)
25201 1726882678.84434: exiting _queue_task() for managed_node2/gather_facts
25201 1726882678.84461: done queuing things up, now waiting for results queue to drain
25201 1726882678.84471: waiting for pending results...
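Editor's note: the "Gathering Facts" task queued above is Ansible's implicit fact-gathering step; the interpreter probe, remote temp-dir creation, and AnsiballZ_setup.py transfer recorded in the lines that follow are the machinery behind it. The play below is a minimal, hypothetical sketch of an equivalent explicit invocation, not the contents of tests_ipv6_nm.yml; only the host name managed_node2 is taken from the trace, and the gather_subset value is illustrative.

    # Hypothetical stand-alone play exercising the same fact-gathering path
    # this trace records: interpreter discovery on the target, transfer of
    # AnsiballZ_setup.py into ~/.ansible/tmp, and execution of the module.
    - name: Gather facts explicitly (illustrative only)
      hosts: managed_node2        # host name taken from the trace above
      gather_facts: false         # disable the implicit run; the task below does it instead
      tasks:
        - name: Run the setup module, which is what "Gathering Facts" invokes
          ansible.builtin.setup:
            gather_subset:
              - all               # default subset, shown for clarity

Running such a play with the same -vvvv verbosity would produce a trace of the same shape as the one that continues below.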
25201 1726882678.84733: running TaskExecutor() for managed_node2/TASK: Gathering Facts 25201 1726882678.84860: in run() - task 0e448fcc-3ce9-313b-197e-0000000000b9 25201 1726882678.84882: variable 'ansible_search_path' from source: unknown 25201 1726882678.84930: calling self._execute() 25201 1726882678.85011: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882678.85022: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882678.85042: variable 'omit' from source: magic vars 25201 1726882678.85184: variable 'omit' from source: magic vars 25201 1726882678.85215: variable 'omit' from source: magic vars 25201 1726882678.85274: variable 'omit' from source: magic vars 25201 1726882678.85321: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25201 1726882678.85368: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25201 1726882678.85399: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25201 1726882678.85422: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25201 1726882678.85440: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25201 1726882678.85485: variable 'inventory_hostname' from source: host vars for 'managed_node2' 25201 1726882678.85499: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882678.85506: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882678.85619: Set connection var ansible_shell_executable to /bin/sh 25201 1726882678.85630: Set connection var ansible_pipelining to False 25201 1726882678.85640: Set connection var ansible_connection to ssh 25201 1726882678.85650: Set connection var ansible_module_compression to ZIP_DEFLATED 25201 1726882678.85658: Set connection var ansible_shell_type to sh 25201 1726882678.85673: Set connection var ansible_timeout to 10 25201 1726882678.85708: variable 'ansible_shell_executable' from source: unknown 25201 1726882678.85720: variable 'ansible_connection' from source: unknown 25201 1726882678.85727: variable 'ansible_module_compression' from source: unknown 25201 1726882678.85734: variable 'ansible_shell_type' from source: unknown 25201 1726882678.85741: variable 'ansible_shell_executable' from source: unknown 25201 1726882678.85748: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882678.85755: variable 'ansible_pipelining' from source: unknown 25201 1726882678.85762: variable 'ansible_timeout' from source: unknown 25201 1726882678.85772: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882678.86009: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 25201 1726882678.86031: variable 'omit' from source: magic vars 25201 1726882678.86045: starting attempt loop 25201 1726882678.86053: running the handler 25201 1726882678.86075: variable 'ansible_facts' from source: unknown 25201 1726882678.86099: _low_level_execute_command(): starting 25201 1726882678.86112: 
_low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 25201 1726882678.86888: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25201 1726882678.86911: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882678.86927: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882678.86962: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882678.87011: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882678.87058: stderr chunk (state=3): >>>debug2: match not found <<< 25201 1726882678.87076: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882678.87094: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25201 1726882678.87105: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 25201 1726882678.87132: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 25201 1726882678.87157: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882678.87174: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882678.87191: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882678.87209: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882678.87222: stderr chunk (state=3): >>>debug2: match found <<< 25201 1726882678.87255: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882678.87334: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882678.87362: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882678.87386: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882678.87521: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882678.89176: stdout chunk (state=3): >>>/root <<< 25201 1726882678.89358: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882678.89362: stdout chunk (state=3): >>><<< 25201 1726882678.89367: stderr chunk (state=3): >>><<< 25201 1726882678.89468: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: 
Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25201 1726882678.89472: _low_level_execute_command(): starting 25201 1726882678.89475: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882678.8938484-25232-173801811291430 `" && echo ansible-tmp-1726882678.8938484-25232-173801811291430="` echo /root/.ansible/tmp/ansible-tmp-1726882678.8938484-25232-173801811291430 `" ) && sleep 0' 25201 1726882678.90609: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882678.90618: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882678.90650: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882678.90653: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882678.90657: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<< 25201 1726882678.90660: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882678.91125: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882678.91133: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882678.91282: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882678.93096: stdout chunk (state=3): >>>ansible-tmp-1726882678.8938484-25232-173801811291430=/root/.ansible/tmp/ansible-tmp-1726882678.8938484-25232-173801811291430 <<< 25201 1726882678.93267: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882678.93273: stdout chunk (state=3): >>><<< 25201 1726882678.93282: stderr chunk (state=3): >>><<< 25201 1726882678.93297: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882678.8938484-25232-173801811291430=/root/.ansible/tmp/ansible-tmp-1726882678.8938484-25232-173801811291430 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25201 1726882678.93329: variable 'ansible_module_compression' from source: unknown 25201 1726882678.93386: ANSIBALLZ: Using generic lock for ansible.legacy.setup 25201 1726882678.93389: ANSIBALLZ: Acquiring lock 25201 1726882678.93391: ANSIBALLZ: Lock acquired: 140300039193808 25201 1726882678.93394: ANSIBALLZ: Creating module 25201 1726882679.47237: ANSIBALLZ: Writing module into payload 25201 1726882679.47416: ANSIBALLZ: Writing module 25201 1726882679.47452: ANSIBALLZ: Renaming module 25201 1726882679.47468: ANSIBALLZ: Done creating module 25201 1726882679.48203: variable 'ansible_facts' from source: unknown 25201 1726882679.48215: variable 'inventory_hostname' from source: host vars for 'managed_node2' 25201 1726882679.48227: _low_level_execute_command(): starting 25201 1726882679.48236: _low_level_execute_command(): executing: /bin/sh -c 'echo PLATFORM; uname; echo FOUND; command -v '"'"'python3.12'"'"'; command -v '"'"'python3.11'"'"'; command -v '"'"'python3.10'"'"'; command -v '"'"'python3.9'"'"'; command -v '"'"'python3.8'"'"'; command -v '"'"'python3.7'"'"'; command -v '"'"'/usr/bin/python3'"'"'; command -v '"'"'python3'"'"'; echo ENDFOUND && sleep 0' 25201 1726882679.48880: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25201 1726882679.49579: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882679.49594: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882679.49610: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882679.49652: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882679.49660: stderr chunk (state=3): >>>debug2: match not found <<< 25201 1726882679.49673: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882679.49686: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25201 1726882679.49694: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 25201 1726882679.49700: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 25201 1726882679.49708: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882679.49717: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882679.49728: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882679.49735: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882679.49741: stderr chunk (state=3): >>>debug2: match found <<< 25201 1726882679.49750: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882679.49831: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882679.49850: stderr chunk 
(state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882679.49863: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882679.50002: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882679.51651: stdout chunk (state=3): >>>PLATFORM <<< 25201 1726882679.51729: stdout chunk (state=3): >>>Linux <<< 25201 1726882679.51754: stdout chunk (state=3): >>>FOUND /usr/bin/python3.9 <<< 25201 1726882679.51757: stdout chunk (state=3): >>>/usr/bin/python3 /usr/bin/python3 ENDFOUND <<< 25201 1726882679.51979: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882679.51982: stdout chunk (state=3): >>><<< 25201 1726882679.51985: stderr chunk (state=3): >>><<< 25201 1726882679.52116: _low_level_execute_command() done: rc=0, stdout=PLATFORM Linux FOUND /usr/bin/python3.9 /usr/bin/python3 /usr/bin/python3 ENDFOUND , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25201 1726882679.52125 [managed_node2]: found interpreters: ['/usr/bin/python3.9', '/usr/bin/python3', '/usr/bin/python3'] 25201 1726882679.52128: _low_level_execute_command(): starting 25201 1726882679.52131: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 && sleep 0' 25201 1726882679.52922: Sending initial data 25201 1726882679.52925: Sent initial data (1181 bytes) 25201 1726882679.53435: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25201 1726882679.53882: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882679.53896: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882679.53914: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882679.53955: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882679.53971: stderr chunk (state=3): >>>debug2: match not found <<< 25201 1726882679.53986: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882679.54004: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25201 1726882679.54016: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 25201 1726882679.54027: stderr chunk (state=3): >>>debug1: 
re-parsing configuration <<< 25201 1726882679.54039: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882679.54052: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882679.54073: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882679.54086: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882679.54097: stderr chunk (state=3): >>>debug2: match found <<< 25201 1726882679.54110: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882679.54189: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882679.54209: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882679.54224: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882679.54348: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882679.58089: stdout chunk (state=3): >>>{"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"9\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"9\"\nPLATFORM_ID=\"platform:el9\"\nPRETTY_NAME=\"CentOS Stream 9\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:9\"\nHOME_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 9\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} <<< 25201 1726882679.58769: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882679.58773: stdout chunk (state=3): >>><<< 25201 1726882679.58775: stderr chunk (state=3): >>><<< 25201 1726882679.58778: _low_level_execute_command() done: rc=0, stdout={"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"9\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"9\"\nPLATFORM_ID=\"platform:el9\"\nPRETTY_NAME=\"CentOS Stream 9\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:9\"\nHOME_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 9\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit 
status from master 0 25201 1726882679.58780: variable 'ansible_facts' from source: unknown 25201 1726882679.58782: variable 'ansible_facts' from source: unknown 25201 1726882679.58784: variable 'ansible_module_compression' from source: unknown 25201 1726882679.58787: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-25201fmfeipqk/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 25201 1726882679.58788: variable 'ansible_facts' from source: unknown 25201 1726882679.58853: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882678.8938484-25232-173801811291430/AnsiballZ_setup.py 25201 1726882679.59389: Sending initial data 25201 1726882679.59393: Sent initial data (154 bytes) 25201 1726882679.62496: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25201 1726882679.62731: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882679.62745: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882679.62765: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882679.62870: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882679.62945: stderr chunk (state=3): >>>debug2: match not found <<< 25201 1726882679.62960: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882679.62983: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25201 1726882679.62997: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 25201 1726882679.63008: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 25201 1726882679.63021: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882679.63034: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882679.63054: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882679.63166: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882679.63180: stderr chunk (state=3): >>>debug2: match found <<< 25201 1726882679.63192: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882679.63271: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882679.63387: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882679.63410: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882679.63540: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882679.65303: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 25201 
1726882679.65399: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 25201 1726882679.65499: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-25201fmfeipqk/tmp5h8u5jfb /root/.ansible/tmp/ansible-tmp-1726882678.8938484-25232-173801811291430/AnsiballZ_setup.py <<< 25201 1726882679.65593: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 25201 1726882679.68571: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882679.68670: stderr chunk (state=3): >>><<< 25201 1726882679.68674: stdout chunk (state=3): >>><<< 25201 1726882679.68799: done transferring module to remote 25201 1726882679.68802: _low_level_execute_command(): starting 25201 1726882679.68805: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882678.8938484-25232-173801811291430/ /root/.ansible/tmp/ansible-tmp-1726882678.8938484-25232-173801811291430/AnsiballZ_setup.py && sleep 0' 25201 1726882679.69363: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25201 1726882679.69381: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882679.69395: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882679.69413: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882679.69460: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882679.69479: stderr chunk (state=3): >>>debug2: match not found <<< 25201 1726882679.69494: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882679.69512: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25201 1726882679.69524: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 25201 1726882679.69535: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 25201 1726882679.69546: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882679.69561: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882679.69587: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882679.69599: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882679.69610: stderr chunk (state=3): >>>debug2: match found <<< 25201 1726882679.69623: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882679.69708: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882679.69741: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882679.69758: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882679.69892: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882679.71710: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882679.71714: stdout chunk (state=3): >>><<< 25201 1726882679.71716: stderr chunk (state=3): >>><<< 25201 1726882679.71804: _low_level_execute_command() 
done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25201 1726882679.71810: _low_level_execute_command(): starting 25201 1726882679.71813: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882678.8938484-25232-173801811291430/AnsiballZ_setup.py && sleep 0' 25201 1726882679.73312: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25201 1726882679.73325: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882679.73342: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882679.73367: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882679.73408: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882679.73466: stderr chunk (state=3): >>>debug2: match not found <<< 25201 1726882679.73484: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882679.73502: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25201 1726882679.73514: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 25201 1726882679.73524: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 25201 1726882679.73536: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882679.73549: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882679.73584: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882679.73596: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882679.73605: stderr chunk (state=3): >>>debug2: match found <<< 25201 1726882679.73617: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882679.73814: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882679.73835: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882679.73850: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882679.73991: 
stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882679.75933: stdout chunk (state=3): >>>import _frozen_importlib # frozen import _imp # builtin <<< 25201 1726882679.75940: stdout chunk (state=3): >>>import '_thread' # import '_warnings' # import '_weakref' # <<< 25201 1726882679.76000: stdout chunk (state=3): >>>import '_io' # import 'marshal' # <<< 25201 1726882679.76035: stdout chunk (state=3): >>>import 'posix' # <<< 25201 1726882679.76063: stdout chunk (state=3): >>>import '_frozen_importlib_external' # <<< 25201 1726882679.76072: stdout chunk (state=3): >>># installing zipimport hook <<< 25201 1726882679.76106: stdout chunk (state=3): >>>import 'time' # <<< 25201 1726882679.76111: stdout chunk (state=3): >>>import 'zipimport' # # installed zipimport hook <<< 25201 1726882679.76162: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' <<< 25201 1726882679.76184: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py <<< 25201 1726882679.76205: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' <<< 25201 1726882679.76214: stdout chunk (state=3): >>>import '_codecs' # <<< 25201 1726882679.76225: stdout chunk (state=3): >>>import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bda9b3dc0> <<< 25201 1726882679.76284: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py <<< 25201 1726882679.76290: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bda9583a0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bda9b3b20> <<< 25201 1726882679.76316: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' <<< 25201 1726882679.76327: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bda9b3ac0> <<< 25201 1726882679.76353: stdout chunk (state=3): >>>import '_signal' # <<< 25201 1726882679.76383: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' <<< 25201 1726882679.76392: stdout chunk (state=3): >>>import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bda958490> <<< 25201 1726882679.76414: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py # code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' <<< 25201 1726882679.76435: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py # code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' <<< 25201 1726882679.76453: stdout chunk (state=3): >>>import '_abc' # <<< 25201 1726882679.76459: stdout chunk (state=3): >>>import 
'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bda958940> <<< 25201 1726882679.76483: stdout chunk (state=3): >>>import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bda958670> <<< 25201 1726882679.76507: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py <<< 25201 1726882679.76522: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' <<< 25201 1726882679.76537: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py <<< 25201 1726882679.76567: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' <<< 25201 1726882679.76582: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py <<< 25201 1726882679.76601: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' <<< 25201 1726882679.76621: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bda90f190> <<< 25201 1726882679.76638: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py <<< 25201 1726882679.76666: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' <<< 25201 1726882679.76734: stdout chunk (state=3): >>>import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bda90f220> <<< 25201 1726882679.76767: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py # code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' <<< 25201 1726882679.76795: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py # code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bda932850> import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bda90f940> <<< 25201 1726882679.76832: stdout chunk (state=3): >>>import 'os' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bda970880> <<< 25201 1726882679.76854: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py # code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' <<< 25201 1726882679.76863: stdout chunk (state=3): >>>import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bda908d90> <<< 25201 1726882679.76922: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py # code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' <<< 25201 1726882679.76928: stdout chunk (state=3): >>>import '_locale' # import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bda932d90> <<< 25201 1726882679.76979: stdout chunk (state=3): >>>import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bda958970> <<< 25201 1726882679.77009: stdout chunk (state=3): >>>Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 
11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. <<< 25201 1726882679.77333: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py <<< 25201 1726882679.77340: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' <<< 25201 1726882679.77370: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py <<< 25201 1726882679.77383: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' <<< 25201 1726882679.77394: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py <<< 25201 1726882679.77409: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' <<< 25201 1726882679.77428: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py <<< 25201 1726882679.77442: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' <<< 25201 1726882679.77448: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bda8aeeb0> <<< 25201 1726882679.77501: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bda8b1f40> <<< 25201 1726882679.77513: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py <<< 25201 1726882679.77530: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' <<< 25201 1726882679.77540: stdout chunk (state=3): >>>import '_sre' # <<< 25201 1726882679.77567: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py <<< 25201 1726882679.77578: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' <<< 25201 1726882679.77600: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py # code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' <<< 25201 1726882679.77624: stdout chunk (state=3): >>>import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bda8a7610> <<< 25201 1726882679.77635: stdout chunk (state=3): >>>import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bda8ad640> <<< 25201 1726882679.77650: stdout chunk (state=3): >>>import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bda8ae370> <<< 25201 1726882679.77672: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py <<< 25201 1726882679.77744: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' <<< 25201 1726882679.77758: stdout chunk (state=3): >>># /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/collections/__init__.py <<< 25201 1726882679.77801: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' <<< 25201 1726882679.77814: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py # code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' <<< 25201 1726882679.77849: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0bda830dc0> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bda8308b0> <<< 25201 1726882679.77869: stdout chunk (state=3): >>>import 'itertools' # <<< 25201 1726882679.77895: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py # code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bda830eb0> <<< 25201 1726882679.77910: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches /usr/lib64/python3.9/operator.py <<< 25201 1726882679.77928: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' <<< 25201 1726882679.77952: stdout chunk (state=3): >>>import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bda830f70> <<< 25201 1726882679.77987: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py # code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' <<< 25201 1726882679.77999: stdout chunk (state=3): >>>import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bda830e80> import '_collections' # <<< 25201 1726882679.78052: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bda889d30> import '_functools' # <<< 25201 1726882679.78085: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bda882610> <<< 25201 1726882679.78143: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py # code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bda896670> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bda8b5e20> <<< 25201 1726882679.78171: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' <<< 25201 1726882679.78200: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0bda842c70> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bda889250> <<< 25201 1726882679.78249: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' # extension module 'binascii' executed from 
'/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' <<< 25201 1726882679.78256: stdout chunk (state=3): >>>import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0bda896280> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bda8bb9d0> <<< 25201 1726882679.78280: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py # code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' <<< 25201 1726882679.78304: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' <<< 25201 1726882679.78331: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py <<< 25201 1726882679.78339: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' <<< 25201 1726882679.78355: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bda842fa0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bda842d90> <<< 25201 1726882679.78394: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches /usr/lib64/python3.9/importlib/machinery.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bda842d00> <<< 25201 1726882679.78413: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' <<< 25201 1726882679.78442: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py <<< 25201 1726882679.78448: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' <<< 25201 1726882679.78475: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py <<< 25201 1726882679.78519: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' <<< 25201 1726882679.78551: stdout chunk (state=3): >>># /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py # code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bda572370> <<< 25201 1726882679.78574: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches /usr/lib64/python3.9/contextlib.py <<< 25201 1726882679.78580: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' <<< 25201 1726882679.78615: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bda572460> <<< 25201 1726882679.78736: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bda84afa0> <<< 25201 1726882679.78780: stdout chunk (state=3): >>>import 'importlib.abc' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f0bda844a30> import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bda844490> <<< 25201 1726882679.78810: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py <<< 25201 1726882679.78817: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' <<< 25201 1726882679.78851: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py <<< 25201 1726882679.78857: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' <<< 25201 1726882679.78896: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py # code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' <<< 25201 1726882679.78903: stdout chunk (state=3): >>>import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bda49b1c0> <<< 25201 1726882679.78931: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bda55dc70> <<< 25201 1726882679.78987: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bda844eb0> import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bda8bb040> <<< 25201 1726882679.79011: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py <<< 25201 1726882679.79028: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' <<< 25201 1726882679.79061: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py # code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bda4adaf0> <<< 25201 1726882679.79069: stdout chunk (state=3): >>>import 'errno' # <<< 25201 1726882679.79098: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' <<< 25201 1726882679.79122: stdout chunk (state=3): >>># extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0bda4ade20> <<< 25201 1726882679.79130: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py # code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' <<< 25201 1726882679.79157: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py # code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' <<< 25201 1726882679.79166: stdout chunk (state=3): >>>import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bda4bf730> <<< 25201 1726882679.79179: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py <<< 25201 1726882679.79216: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' <<< 25201 1726882679.79240: stdout chunk 
(state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bda4bfc70> <<< 25201 1726882679.79287: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0bda44c3a0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bda4adf10> <<< 25201 1726882679.79307: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py # code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' <<< 25201 1726882679.79366: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0bda45d280> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bda4bf5b0> <<< 25201 1726882679.79373: stdout chunk (state=3): >>>import 'pwd' # <<< 25201 1726882679.79398: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0bda45d340> <<< 25201 1726882679.79437: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bda8429d0> <<< 25201 1726882679.79451: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py <<< 25201 1726882679.79472: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' <<< 25201 1726882679.79486: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py <<< 25201 1726882679.79500: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' <<< 25201 1726882679.79535: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0bda4786a0> <<< 25201 1726882679.79549: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' <<< 25201 1726882679.79589: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0bda478970> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bda478760> <<< 25201 1726882679.79610: stdout chunk (state=3): >>># extension module '_random' loaded from 
'/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0bda478850> <<< 25201 1726882679.79632: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc matches /usr/lib64/python3.9/hashlib.py <<< 25201 1726882679.79639: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' <<< 25201 1726882679.79831: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0bda478ca0> <<< 25201 1726882679.79867: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' <<< 25201 1726882679.79879: stdout chunk (state=3): >>># extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0bda4851f0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bda4788e0> <<< 25201 1726882679.79887: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bda46ca30> <<< 25201 1726882679.79900: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bda8425b0> <<< 25201 1726882679.79924: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py <<< 25201 1726882679.79982: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' <<< 25201 1726882679.80018: stdout chunk (state=3): >>>import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bda478a90> <<< 25201 1726882679.80151: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/cp437.pyc' <<< 25201 1726882679.80167: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f0bda39c670> <<< 25201 1726882679.80423: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available <<< 25201 1726882679.80516: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882679.80548: stdout chunk (state=3): >>>import ansible # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/__init__.py # zipimport: zlib available <<< 25201 1726882679.80567: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882679.80580: stdout chunk (state=3): >>>import ansible.module_utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/__init__.py <<< 25201 1726882679.80594: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882679.81812: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882679.82736: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py # code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bd9dae7c0> <<< 25201 1726882679.82754: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py # code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' <<< 25201 1726882679.82785: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' <<< 25201 1726882679.82804: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py # code object from '/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' <<< 25201 1726882679.82835: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0bd9dae160> <<< 25201 1726882679.82879: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bd9dae280> <<< 25201 1726882679.82906: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bd9daef10> <<< 25201 1726882679.82926: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' <<< 25201 1726882679.82982: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bd9dae4f0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bd9daed30> <<< 25201 1726882679.82991: stdout chunk (state=3): >>>import 'atexit' # <<< 25201 1726882679.83013: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0bd9daef70> <<< 25201 1726882679.83024: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py <<< 25201 1726882679.83056: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' <<< 25201 1726882679.83097: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bd9dae100> <<< 25201 1726882679.83120: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py <<< 25201 1726882679.83126: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' <<< 25201 1726882679.83150: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py <<< 25201 1726882679.83170: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' <<< 25201 1726882679.83191: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py # code object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' <<< 25201 1726882679.83285: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bd9d83130> <<< 25201 1726882679.83314: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0bd9c870d0> <<< 25201 1726882679.83344: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0bd9c872b0> <<< 25201 1726882679.83358: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py <<< 25201 1726882679.83375: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' <<< 25201 1726882679.83409: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bd9c87c40> <<< 25201 1726882679.83424: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bd9d95dc0> <<< 25201 1726882679.83592: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bd9d953a0> <<< 25201 1726882679.83611: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py # code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' <<< 25201 1726882679.83634: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bd9d95f70> <<< 25201 1726882679.83648: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py <<< 25201 1726882679.83664: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' <<< 25201 1726882679.83704: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py # code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' <<< 25201 1726882679.83716: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches /usr/lib64/python3.9/tokenize.py <<< 25201 1726882679.83733: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' <<< 25201 1726882679.83760: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py # code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' <<< 25201 1726882679.83766: stdout chunk (state=3): >>>import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bd9de3c10> <<< 25201 1726882679.83847: stdout chunk 
(state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bd9db1cd0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bd9db13a0> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bd9d62b80> <<< 25201 1726882679.83879: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0bd9db14c0> <<< 25201 1726882679.83905: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bd9db14f0> <<< 25201 1726882679.83924: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py <<< 25201 1726882679.83942: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' <<< 25201 1726882679.83954: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py <<< 25201 1726882679.83998: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' <<< 25201 1726882679.84059: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0bd9ce5250> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bd9df51f0> <<< 25201 1726882679.84089: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py <<< 25201 1726882679.84096: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' <<< 25201 1726882679.84154: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0bd9cf28e0> <<< 25201 1726882679.84166: stdout chunk (state=3): >>>import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bd9df5370> <<< 25201 1726882679.84176: stdout chunk (state=3): >>># /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py <<< 25201 1726882679.84218: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' <<< 25201 1726882679.84243: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py <<< 25201 1726882679.84250: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' 
import '_string' # <<< 25201 1726882679.84310: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bd9df5ca0> <<< 25201 1726882679.84439: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bd9cf2880> <<< 25201 1726882679.84532: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0bd9ce58b0> <<< 25201 1726882679.84560: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0bd9d8e190> <<< 25201 1726882679.84626: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0bd9df5670> <<< 25201 1726882679.84640: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bd9ded8b0> <<< 25201 1726882679.84662: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py <<< 25201 1726882679.84676: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' <<< 25201 1726882679.84721: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0bd9ce79d0> <<< 25201 1726882679.84912: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0bd9d04b80> <<< 25201 1726882679.84915: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bd9cf1640> <<< 25201 1726882679.84959: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 
0x7f0bd9ce7f70> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bd9cf1a30> <<< 25201 1726882679.84986: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882679.84999: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.compat # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/__init__.py # zipimport: zlib available <<< 25201 1726882679.85067: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882679.85155: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 25201 1726882679.85192: stdout chunk (state=3): >>>import ansible.module_utils.common # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/__init__.py <<< 25201 1726882679.85205: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882679.85300: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882679.85398: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882679.85833: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882679.86292: stdout chunk (state=3): >>>import ansible.module_utils.six # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/six/__init__.py import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import ansible.module_utils.common.text.converters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/converters.py <<< 25201 1726882679.86330: stdout chunk (state=3): >>># /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py <<< 25201 1726882679.86333: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' <<< 25201 1726882679.86388: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0bd9d2d7c0> <<< 25201 1726882679.86465: stdout chunk (state=3): >>># /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' <<< 25201 1726882679.86480: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bd9d32820> <<< 25201 1726882679.86483: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bd98819a0> <<< 25201 1726882679.86518: stdout chunk (state=3): >>>import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/selinux.py <<< 25201 
1726882679.86536: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 25201 1726882679.86565: stdout chunk (state=3): >>>import ansible.module_utils._text # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/_text.py # zipimport: zlib available <<< 25201 1726882679.86691: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882679.86818: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py # code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' <<< 25201 1726882679.86846: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bd9d6c760> # zipimport: zlib available <<< 25201 1726882679.87238: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882679.87602: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882679.87653: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882679.87722: stdout chunk (state=3): >>>import ansible.module_utils.common.collections # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/collections.py # zipimport: zlib available <<< 25201 1726882679.87758: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882679.87788: stdout chunk (state=3): >>>import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/warnings.py <<< 25201 1726882679.87802: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882679.87852: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882679.87938: stdout chunk (state=3): >>>import ansible.module_utils.errors # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/errors.py # zipimport: zlib available <<< 25201 1726882679.87970: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/parsing/__init__.py <<< 25201 1726882679.87973: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882679.87997: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882679.88042: stdout chunk (state=3): >>>import ansible.module_utils.parsing.convert_bool # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/parsing/convert_bool.py <<< 25201 1726882679.88045: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882679.88229: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882679.88415: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py <<< 25201 1726882679.88446: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' import '_ast' # <<< 25201 1726882679.88524: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bd9db03d0> # zipimport: zlib available <<< 25201 1726882679.88591: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882679.88660: 
stdout chunk (state=3): >>>import ansible.module_utils.common.text.formatters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/formatters.py import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/validation.py import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/parameters.py <<< 25201 1726882679.88690: stdout chunk (state=3): >>>import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/arg_spec.py <<< 25201 1726882679.88694: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882679.88725: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882679.88772: stdout chunk (state=3): >>>import ansible.module_utils.common.locale # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/locale.py <<< 25201 1726882679.88775: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882679.88805: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882679.88844: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882679.88933: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882679.88997: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py <<< 25201 1726882679.89017: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' <<< 25201 1726882679.89094: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0bd9d249a0> <<< 25201 1726882679.89181: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bd96fbbe0> <<< 25201 1726882679.89219: stdout chunk (state=3): >>>import ansible.module_utils.common.file # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/file.py import ansible.module_utils.common.process # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/process.py <<< 25201 1726882679.89222: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882679.89278: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882679.89326: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882679.89355: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882679.89388: stdout chunk (state=3): >>># /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py <<< 25201 1726882679.89404: stdout chunk (state=3): >>># code 
object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' <<< 25201 1726882679.89416: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py <<< 25201 1726882679.89460: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' <<< 25201 1726882679.89474: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py <<< 25201 1726882679.89499: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' <<< 25201 1726882679.89574: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bd9d35670> <<< 25201 1726882679.89615: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bd9d80d90> <<< 25201 1726882679.89675: stdout chunk (state=3): >>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bd9db0400> # destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/distro/__init__.py <<< 25201 1726882679.89681: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882679.89705: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882679.89726: stdout chunk (state=3): >>>import ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/_utils.py import ansible.module_utils.common.sys_info # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/sys_info.py <<< 25201 1726882679.89798: stdout chunk (state=3): >>>import ansible.module_utils.basic # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/basic.py <<< 25201 1726882679.89824: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 25201 1726882679.89837: stdout chunk (state=3): >>>import ansible.modules # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/modules/__init__.py <<< 25201 1726882679.89843: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882679.89899: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882679.89952: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882679.89975: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882679.89987: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882679.90025: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882679.90061: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882679.90093: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882679.90129: stdout chunk (state=3): >>>import ansible.module_utils.facts.namespace # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/namespace.py <<< 25201 1726882679.90134: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882679.90198: stdout chunk (state=3): >>># zipimport: zlib available 
<<< 25201 1726882679.90267: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882679.90282: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882679.90315: stdout chunk (state=3): >>>import ansible.module_utils.compat.typing # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/typing.py # zipimport: zlib available <<< 25201 1726882679.90491: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882679.90605: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882679.90640: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882679.90688: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/__init__.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc' <<< 25201 1726882679.90712: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/context.py <<< 25201 1726882679.90731: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc' <<< 25201 1726882679.90745: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/process.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc' <<< 25201 1726882679.90778: stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bd98a0550> <<< 25201 1726882679.90798: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/reduction.py <<< 25201 1726882679.90808: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc' <<< 25201 1726882679.90825: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc matches /usr/lib64/python3.9/pickle.py <<< 25201 1726882679.90848: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc' <<< 25201 1726882679.90880: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc matches /usr/lib64/python3.9/_compat_pickle.py # code object from '/usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc' <<< 25201 1726882679.90895: stdout chunk (state=3): >>>import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bd9862a90> <<< 25201 1726882679.90933: stdout chunk (state=3): >>># extension module '_pickle' loaded from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0bd9862a00> <<< 25201 1726882679.91001: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bd9893760> <<< 25201 1726882679.91014: stdout chunk (state=3): >>>import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bd98a0e80> <<< 25201 1726882679.91034: stdout chunk (state=3): >>>import 'multiprocessing.context' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f0bd95fff10> <<< 25201 1726882679.91048: stdout chunk (state=3): >>>import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bd95ffaf0> <<< 25201 1726882679.91068: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/pool.py <<< 25201 1726882679.91087: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc' <<< 25201 1726882679.91107: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc matches /usr/lib64/python3.9/queue.py <<< 25201 1726882679.91113: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc' <<< 25201 1726882679.91144: stdout chunk (state=3): >>># extension module '_queue' loaded from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' <<< 25201 1726882679.91157: stdout chunk (state=3): >>># extension module '_queue' executed from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0bd9d91cd0> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bd984c160> <<< 25201 1726882679.91187: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/util.py <<< 25201 1726882679.91193: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc' <<< 25201 1726882679.91219: stdout chunk (state=3): >>>import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bd9d912e0> <<< 25201 1726882679.91233: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/connection.py <<< 25201 1726882679.91256: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc' <<< 25201 1726882679.91287: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0bd9667fa0> <<< 25201 1726882679.91313: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bd9890dc0> <<< 25201 1726882679.91350: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bd95ffdc0> import ansible.module_utils.facts.timeout # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/timeout.py <<< 25201 1726882679.91365: stdout chunk (state=3): >>>import ansible.module_utils.facts.collector # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/collector.py # zipimport: zlib available <<< 25201 1726882679.91385: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.facts.other # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/__init__.py <<< 25201 1726882679.91404: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882679.91450: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882679.91509: stdout chunk (state=3): >>>import ansible.module_utils.facts.other.facter # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/facter.py # zipimport: zlib available <<< 25201 1726882679.91553: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882679.91605: stdout chunk (state=3): >>>import ansible.module_utils.facts.other.ohai # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/ohai.py <<< 25201 1726882679.91622: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 25201 1726882679.91637: stdout chunk (state=3): >>>import ansible.module_utils.facts.system # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/__init__.py # zipimport: zlib available <<< 25201 1726882679.91663: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882679.91699: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.apparmor # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/apparmor.py <<< 25201 1726882679.91706: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882679.91747: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882679.91791: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.caps # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/caps.py # zipimport: zlib available <<< 25201 1726882679.91831: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882679.91875: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.chroot # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/chroot.py <<< 25201 1726882679.91881: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882679.91929: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882679.91984: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882679.92022: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882679.92088: stdout chunk (state=3): >>>import ansible.module_utils.facts.utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/utils.py import ansible.module_utils.facts.system.cmdline # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/cmdline.py <<< 25201 1726882679.92094: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882679.92476: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882679.92838: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.distribution # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/distribution.py # zipimport: zlib available <<< 25201 1726882679.92881: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882679.92933: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882679.92959: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882679.92996: stdout chunk (state=3): >>>import ansible.module_utils.compat.datetime # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/datetime.py import ansible.module_utils.facts.system.date_time # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/date_time.py <<< 25201 1726882679.93004: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882679.93027: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882679.93059: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.env # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/env.py <<< 25201 1726882679.93068: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882679.93113: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882679.93166: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.dns # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/dns.py <<< 25201 1726882679.93172: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882679.93192: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882679.93222: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.fips # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/fips.py # zipimport: zlib available <<< 25201 1726882679.93260: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882679.93285: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.loadavg # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/loadavg.py <<< 25201 1726882679.93292: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882679.93351: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882679.93429: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc matches /usr/lib64/python3.9/glob.py # code object from '/usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc' <<< 25201 1726882679.93455: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bd9884670> <<< 25201 1726882679.93470: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc matches /usr/lib64/python3.9/configparser.py <<< 25201 1726882679.93498: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc' <<< 25201 1726882679.93654: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bd9580f10> import ansible.module_utils.facts.system.local # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/local.py <<< 25201 1726882679.93661: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882679.93718: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882679.93779: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.lsb # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/lsb.py <<< 25201 1726882679.93786: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882679.93855: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882679.93937: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.pkg_mgr # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/pkg_mgr.py # zipimport: zlib available <<< 25201 1726882679.93999: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882679.94073: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.platform # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/platform.py <<< 25201 1726882679.94076: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882679.94104: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882679.94154: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc matches /usr/lib64/python3.9/ssl.py <<< 25201 1726882679.94170: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc' <<< 25201 1726882679.94322: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0bd9571c10> <<< 25201 1726882679.94566: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bd95beb20> import ansible.module_utils.facts.system.python # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/python.py <<< 25201 1726882679.94573: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882679.94618: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882679.94671: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.selinux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/selinux.py # zipimport: zlib available <<< 25201 1726882679.94746: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882679.94813: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882679.94913: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882679.95051: stdout chunk (state=3): >>>import ansible.module_utils.compat.version # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/version.py import ansible.module_utils.facts.system.service_mgr # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/service_mgr.py <<< 25201 1726882679.95055: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882679.95090: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882679.95125: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.ssh_pub_keys # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/ssh_pub_keys.py <<< 25201 1726882679.95131: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882679.95160: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882679.95208: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc matches /usr/lib64/python3.9/getpass.py # code object from '/usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc' <<< 25201 1726882679.95258: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0bd94fa4f0> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bd94faa30> import ansible.module_utils.facts.system.user # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/user.py <<< 25201 1726882679.95291: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 25201 1726882679.95303: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/__init__.py # zipimport: zlib available <<< 25201 1726882679.95346: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882679.95392: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/base.py <<< 25201 1726882679.95398: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882679.95523: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882679.95651: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.aix # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/aix.py # zipimport: zlib available <<< 25201 1726882679.95739: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882679.95820: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882679.95855: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882679.95899: stdout chunk (state=3): >>>import ansible.module_utils.facts.sysctl # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/sysctl.py <<< 25201 1726882679.95906: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.darwin # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/darwin.py # zipimport: zlib available <<< 25201 1726882679.95990: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882679.96005: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882679.96124: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882679.96250: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/freebsd.py import ansible.module_utils.facts.hardware.dragonfly # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/dragonfly.py <<< 25201 1726882679.96257: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882679.96359: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882679.96465: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/hpux.py <<< 25201 1726882679.96477: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882679.96497: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882679.96526: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882679.96969: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882679.97377: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/linux.py import ansible.module_utils.facts.hardware.hurd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/hurd.py <<< 25201 1726882679.97399: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882679.97479: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882679.97571: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/netbsd.py # zipimport: zlib available <<< 25201 1726882679.97654: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882679.97740: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/openbsd.py <<< 25201 1726882679.97747: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882679.97868: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882679.98005: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/sunos.py # zipimport: zlib available <<< 25201 1726882679.98020: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.facts.network # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/__init__.py <<< 25201 1726882679.98035: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882679.98074: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882679.98114: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/base.py <<< 25201 1726882679.98120: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882679.98204: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882679.98283: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882679.98453: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882679.98626: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.generic_bsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/generic_bsd.py import ansible.module_utils.facts.network.aix # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/aix.py <<< 25201 1726882679.98632: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882679.98667: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882679.98706: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.darwin # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/darwin.py <<< 25201 1726882679.98719: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882679.98728: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882679.98751: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.dragonfly # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/dragonfly.py # zipimport: zlib available <<< 25201 1726882679.98822: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882679.98884: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.fc_wwn # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/fc_wwn.py <<< 25201 1726882679.98897: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882679.98908: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882679.98934: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/freebsd.py # zipimport: zlib available <<< 25201 1726882679.98989: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882679.99045: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/hpux.py <<< 25201 1726882679.99052: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882679.99091: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 
1726882679.99148: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.hurd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/hurd.py # zipimport: zlib available <<< 25201 1726882679.99368: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882679.99587: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/linux.py <<< 25201 1726882679.99595: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882679.99635: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882679.99690: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.iscsi # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/iscsi.py # zipimport: zlib available <<< 25201 1726882679.99726: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882679.99749: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.nvme # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/nvme.py <<< 25201 1726882679.99767: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882679.99790: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882679.99829: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/netbsd.py # zipimport: zlib available <<< 25201 1726882679.99866: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882679.99887: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/openbsd.py <<< 25201 1726882679.99903: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882679.99966: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882680.00038: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/sunos.py # zipimport: zlib available <<< 25201 1726882680.00067: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.facts.virtual # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/__init__.py <<< 25201 1726882680.00080: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882680.00114: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882680.00156: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/base.py # zipimport: zlib available <<< 25201 1726882680.00196: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882680.00202: stdout chunk (state=3): >>># zipimport: zlib available <<< 
25201 1726882680.00242: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882680.00284: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882680.00344: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882680.00416: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.sysctl # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/sysctl.py import ansible.module_utils.facts.virtual.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/freebsd.py import ansible.module_utils.facts.virtual.dragonfly # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/dragonfly.py <<< 25201 1726882680.00435: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882680.00470: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882680.00520: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/hpux.py # zipimport: zlib available <<< 25201 1726882680.00685: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882680.00845: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/linux.py <<< 25201 1726882680.00852: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882680.00884: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882680.00930: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/netbsd.py # zipimport: zlib available <<< 25201 1726882680.00976: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882680.01019: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/openbsd.py <<< 25201 1726882680.01023: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882680.01084: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882680.01165: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/sunos.py import ansible.module_utils.facts.default_collectors # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/default_collectors.py <<< 25201 1726882680.01171: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882680.01235: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882680.01317: stdout chunk (state=3): >>>import ansible.module_utils.facts.ansible_collector # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/ansible_collector.py import ansible.module_utils.facts.compat # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/compat.py import ansible.module_utils.facts # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/__init__.py <<< 25201 1726882680.01390: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882680.02160: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc matches /usr/lib64/python3.9/encodings/idna.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc' <<< 25201 1726882680.02197: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc matches /usr/lib64/python3.9/stringprep.py <<< 25201 1726882680.02204: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc' <<< 25201 1726882680.02239: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0bd932a0d0> <<< 25201 1726882680.02248: stdout chunk (state=3): >>>import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bd932a730> <<< 25201 1726882680.02300: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bd932aa30> <<< 25201 1726882680.04251: stdout chunk (state=3): >>>import 'gc' # <<< 25201 1726882680.06257: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/queues.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/queues.py <<< 25201 1726882680.06276: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/queues.cpython-39.pyc' <<< 25201 1726882680.06303: stdout chunk (state=3): >>>import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bd932afa0> <<< 25201 1726882680.06333: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/synchronize.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/synchronize.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/synchronize.cpython-39.pyc' <<< 25201 1726882680.06358: stdout chunk (state=3): >>>import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bd95455b0> <<< 25201 1726882680.06416: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/dummy/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/dummy/__init__.py <<< 25201 1726882680.06425: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/dummy/__pycache__/__init__.cpython-39.pyc' <<< 25201 1726882680.06456: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/dummy/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.9/multiprocessing/dummy/__pycache__/connection.cpython-39.pyc' import 'multiprocessing.dummy.connection' 
# <_frozen_importlib_external.SourceFileLoader object at 0x7f0bd94ed280> <<< 25201 1726882680.06473: stdout chunk (state=3): >>>import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bd94ed790> <<< 25201 1726882680.06749: stdout chunk (state=3): >>>PyThreadState_Clear: warning: thread still has a frame <<< 25201 1726882680.06756: stdout chunk (state=3): >>>PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame <<< 25201 1726882680.31249: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBALEARW5ZJ51XTLSDuUsPojumVU0f1DmiQsXjMOap4QLlljOiysapjSUe6pZOyAdiI/KfARhDoOFvlC07kCLCcs7DDk8JxBZpsM0D55SdDlfwsB3FVgWNP+9by8G6kzbePHWdZyyWlAuavj4OAEwAjpWpP8/daus0ha4xywlVVoKjAAAAFQCbiW4bR+tgMvjrxC198dqI1mTbjQAAAIBzCzkJTtnGDKfOHq2dFI5cUEuaj1PgRot3wyaXENzUjZVnIFgXUmgKDCxO+EAtU6uAkBPQF4XNgiuaw5bavYpZxcJ4WIpM4ZDRoSkc7BBbJPRLZ45GfrHJwgqAmAZ3RSvVqeXE4WKQHLm43/eDHewgPqqqWe6QVuQH5SEe79yk3wAAAIEArG+AuupiAeoVJ9Lh36QMj4kRo5pTASh2eD5MqSOdy39UhsXbWBcj3JCIvNk/nwep/9neGyRZ5t5wT05dRX80vlgZJX65hrbepO+lqC3wlng+6GQ34D7TJKYnvEkR3neE0+06kx5R6IRWZf1YQV6fMQhx8AJ2JmvnLFicmYlkhQQ=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDND+RJCrYgIUzolo5fZ64Ey6cksefKDUWmGDjsqVTmuT3HrlDyUZOro4JAnUQBmiamXsJUFbrFdJAVpukD4yyowqCQLr0ZFuKNEzrt5CObrtWflOskKynO3kaoU0WhDkqIbwS2j/+NxBCxgDGqd/5Os3cOMv3eyjUElz6xoI4zsmGMfxVYmT+/SHBfoyxyqY8Hw2Ooq+H5L9OlYgV4hqu7kKPpM1THUJTjy47m6qvws5gztclLjPA1KIW2Dz6kKzUYspNJcoS2sK1xFvL7mBjpGAP7WhXVH2n5ySenQ24Z6mEj+tG2f11rjPpjCUjDzzciGCWiRDZWBLm/GGmQXJJ8zAYnw82yIUKqufLrr1wmcXICPMVj9pFjXSoBWe/yhX9E87w7YD5HWsUrgrLdSctdV4QYy+R5g9ERi7FjwbRsuZ04BihZs70+f/29hUzuc6MA87KVovGT0Uc7GVC7bx8NLt0bTBsbydlONVHVQuol/YEpQrQophDvmBfh+PgMDH8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOEITn1vyppR+Moe1UdR0WGPhUnQ/dwHNcNi0OYy21LkBQ5jsxOPLvZ+C2MbRYlz2afs4nYYIV8E0AuK6aRks3w=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIKEdFOHVk9tX1R+zEyLVdxS/U5QeeeFYWSnUmjpXlpt7", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_pkg_mgr": "dnf", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, 
"rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_system": "Linux", "ansible_kernel": "5.14.0-508.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 15:49:37 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "ip-10-31-11-158.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-11-158", "ansible_nodename": "ip-10-31-11-158.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "21e18164a0c64d0daed004bd8a1b67b7", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "38", "second": "00", "epoch": "1726882680", "epoch_int": "1726882680", "date": "2024-09-20", "time": "21:38:00", "iso8601_micro": "2024-09-21T01:38:00.038611Z", "iso8601": "2024-09-21T01:38:00Z", "iso8601_basic": "20240920T213800038611", "iso8601_basic_short": "20240920T213800", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_user<<< 25201 1726882680.31311: stdout chunk (state=3): >>>_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_lsb": {}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_iscsi_iqn": "", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:ef4e1c39-6f50-438a-87e7-12fb70b80bde", "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.40.7 33528 10.31.11.158 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.40.7 33528 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_is_chroot": false, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_local": {}, "ansible_apparmor": {"status": "disabled"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2790, 
"ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 742, "free": 2790}, "nocache": {"free": 3253, "used": 279}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2e6858-9a88-b36a-7765-70992ab591a7", "ansible_product_uuid": "ec2e6858-9a88-b36a-7765-70992ab591a7", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 619, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,<<< 25201 1726882680.31317: stdout chunk (state=3): >>>attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264238632960, "block_size": 4096, "block_total": 65519355, "block_available": 64511385, "block_used": 1007970, "inode_total": 131071472, "inode_available": 130998694, "inode_used": 72778, "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013"}], "ansible_fips": false, "ansible_loadavg": {"1m": 0.47, "5m": 0.41, "15m": 0.23}, "ansible_fibre_channel_wwn": [], "ansible_interfaces": ["lo", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", 
"netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:4f:68:7a:de:b1", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.11.158", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::104f:68ff:fe7a:deb1", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fi<<< 25201 1726882680.31325: stdout chunk (state=3): >>>xed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", 
"rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.11.158", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:4f:68:7a:de:b1", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.11.158"], "ansible_all_ipv6_addresses": ["fe80::104f:68ff:fe7a:deb1"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.11.158", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::104f:68ff:fe7a:deb1"]}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 25201 1726882680.31851: stdout chunk (state=3): >>># clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout <<< 25201 1726882680.31953: stdout chunk (state=3): >>># restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing 
functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json<<< 25201 1726882680.32061: stdout chunk (state=3): >>> # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # 
cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast <<< 25201 1726882680.32306: stdout chunk (state=3): >>># cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing 
ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] 
removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts<<< 25201 1726882680.32310: stdout chunk (state=3): >>> # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy 
ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing gc # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy <<< 25201 1726882680.32598: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 25201 1726882680.32617: stdout chunk (state=3): >>># destroy importlib.util # destroy importlib.abc # destroy importlib.machinery <<< 25201 1726882680.32658: stdout chunk (state=3): >>># destroy zipimport # destroy _compression # destroy binascii # destroy importlib # destroy bz2 # destroy lzma <<< 25201 1726882680.32698: stdout chunk (state=3): >>># destroy __main__ # destroy locale # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings <<< 25201 1726882680.32722: stdout chunk (state=3): >>># destroy syslog # destroy uuid <<< 25201 1726882680.32762: stdout chunk (state=3): >>># destroy selinux # destroy distro # destroy logging # destroy argparse <<< 25201 1726882680.32832: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector <<< 25201 1726882680.32860: stdout chunk (state=3): >>># destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy pickle # destroy _compat_pickle <<< 25201 1726882680.32904: stdout chunk (state=3): >>># destroy queue # destroy multiprocessing.reduction # destroy shlex # destroy datetime # destroy base64 <<< 25201 1726882680.32942: stdout chunk (state=3): >>># destroy ansible.module_utils.compat.selinux # destroy getpass <<< 25201 1726882680.32971: stdout chunk (state=3): >>># destroy json # destroy socket # destroy struct # destroy glob # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy array # destroy multiprocessing.dummy.connection <<< 25201 1726882680.33089: stdout chunk (state=3): >>># cleanup[3] wiping gc # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping unicodedata # cleanup[3] wiping termios # cleanup[3] wiping _ssl # cleanup[3] wiping configparser # cleanup[3] wiping _multiprocessing # cleanup[3] wiping _queue # cleanup[3] wiping _pickle # cleanup[3] wiping selinux._selinux <<< 25201 1726882680.33177: stdout chunk (state=3): >>># cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] 
wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools <<< 25201 1726882680.33259: stdout chunk (state=3): >>># cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs <<< 25201 1726882680.33262: stdout chunk (state=3): >>># cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins <<< 25201 1726882680.33315: stdout chunk (state=3): >>># destroy gc # destroy unicodedata # destroy termios # destroy _ssl # destroy _multiprocessing # destroy _queue # destroy _pickle # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal <<< 25201 1726882680.33499: stdout chunk (state=3): >>># destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize <<< 25201 1726882680.33543: stdout chunk (state=3): >>># destroy _heapq # destroy posixpath <<< 25201 1726882680.33566: stdout chunk (state=3): >>># destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy 
ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator <<< 25201 1726882680.33584: stdout chunk (state=3): >>># destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal <<< 25201 1726882680.33610: stdout chunk (state=3): >>># destroy _frozen_importlib # clear sys.audit hooks <<< 25201 1726882680.34034: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. <<< 25201 1726882680.34037: stdout chunk (state=3): >>><<< 25201 1726882680.34039: stderr chunk (state=3): >>><<< 25201 1726882680.34180: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py # code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' import '_codecs' # import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bda9b3dc0> # /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bda9583a0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bda9b3b20> # /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bda9b3ac0> import '_signal' # # /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bda958490> # /usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py # code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py # code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' import '_abc' # import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bda958940> import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bda958670> # /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py # code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py # code object from 
'/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py # code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' import '_stat' # import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bda90f190> # /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py # code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bda90f220> # /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py # code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py # code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bda932850> import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bda90f940> import 'os' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bda970880> # /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py # code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bda908d90> # /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py # code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' import '_locale' # import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bda932d90> import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bda958970> Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py # code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py # code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py # code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py # code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bda8aeeb0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bda8b1f40> # /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py # code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' import '_sre' # # /usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py # code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py # code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bda8a7610> import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bda8ad640> import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bda8ae370> # /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py # code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/collections/__init__.py # code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py # code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0bda830dc0> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bda8308b0> import 'itertools' # # /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py # code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bda830eb0> # /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches /usr/lib64/python3.9/operator.py # code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bda830f70> # /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py # code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bda830e80> import '_collections' # import 
'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bda889d30> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bda882610> # /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py # code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bda896670> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bda8b5e20> # /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0bda842c70> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bda889250> # extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0bda896280> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bda8bb9d0> # /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py # code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py # code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bda842fa0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bda842d90> # /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches /usr/lib64/python3.9/importlib/machinery.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bda842d00> # /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py # code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py # code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bda572370> # /usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches 
/usr/lib64/python3.9/contextlib.py # code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bda572460> import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bda84afa0> import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bda844a30> import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bda844490> # /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py # code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py # code object from '/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py # code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bda49b1c0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bda55dc70> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bda844eb0> import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bda8bb040> # /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py # code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py # code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bda4adaf0> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0bda4ade20> # /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py # code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py # code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bda4bf730> # /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py # code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bda4bfc70> # extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0bda44c3a0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bda4adf10> # /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py # code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' # 
extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0bda45d280> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bda4bf5b0> import 'pwd' # # extension module 'grp' loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0bda45d340> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bda8429d0> # /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py # code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py # code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' # extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0bda4786a0> # /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0bda478970> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bda478760> # extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0bda478850> # /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc matches /usr/lib64/python3.9/hashlib.py # code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0bda478ca0> # extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0bda4851f0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bda4788e0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bda46ca30> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bda8425b0> # /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py # code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' import 
'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bda478a90> # code object from '/usr/lib64/python3.9/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f0bda39c670> # zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available # zipimport: zlib available import ansible # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/__init__.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py # code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bd9dae7c0> # /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py # code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py # code object from '/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' # extension module '_json' loaded from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0bd9dae160> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bd9dae280> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bd9daef10> # /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bd9dae4f0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bd9daed30> import 'atexit' # # extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0bd9daef70> # /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py # code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bd9dae100> # /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py # code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py # code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' # 
/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py # code object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bd9d83130> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0bd9c870d0> # extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0bd9c872b0> # /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py # code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bd9c87c40> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bd9d95dc0> import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bd9d953a0> # /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py # code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bd9d95f70> # /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py # code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py # code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches /usr/lib64/python3.9/tokenize.py # code object from '/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py # code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bd9de3c10> import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bd9db1cd0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bd9db13a0> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bd9d62b80> # extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0bd9db14c0> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bd9db14f0> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py # code object from 
'/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py # code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0bd9ce5250> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bd9df51f0> # /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py # code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0bd9cf28e0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bd9df5370> # /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py # code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py # code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bd9df5ca0> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bd9cf2880> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0bd9ce58b0> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0bd9d8e190> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0bd9df5670> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bd9ded8b0> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py # code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' # 
extension module '_socket' executed from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0bd9ce79d0> # extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0bd9d04b80> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bd9cf1640> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0bd9ce7f70> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bd9cf1a30> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.six # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/six/__init__.py import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import ansible.module_utils.common.text.converters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/converters.py # /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0bd9d2d7c0> # /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bd9d32820> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bd98819a0> import ansible.module_utils.compat.selinux # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/selinux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils._text # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/_text.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py # code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bd9d6c760> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.collections # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/collections.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/warnings.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.errors # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/errors.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/parsing/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing.convert_bool # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/parsing/convert_bool.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py # code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bd9db03d0> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text.formatters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/formatters.py import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/validation.py import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/parameters.py import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/arg_spec.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.locale # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/locale.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # 
/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0bd9d249a0> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bd96fbbe0> import ansible.module_utils.common.file # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/file.py import ansible.module_utils.common.process # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/process.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py # code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py # code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py # code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bd9d35670> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bd9d80d90> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bd9db0400> # destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/distro/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/_utils.py import ansible.module_utils.common.sys_info # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/sys_info.py import ansible.module_utils.basic # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/basic.py # zipimport: zlib available # zipimport: zlib available import ansible.modules # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/modules/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.namespace # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/namespace.py # zipimport: zlib available # zipimport: zlib available 
# zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.typing # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/typing.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/__init__.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/context.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc' # /usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/process.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bd98a0550> # /usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/reduction.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc matches /usr/lib64/python3.9/pickle.py # code object from '/usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc matches /usr/lib64/python3.9/_compat_pickle.py # code object from '/usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bd9862a90> # extension module '_pickle' loaded from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0bd9862a00> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bd9893760> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bd98a0e80> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bd95fff10> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bd95ffaf0> # /usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/pool.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc matches /usr/lib64/python3.9/queue.py # code object from '/usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc' # extension module '_queue' loaded from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0bd9d91cd0> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bd984c160> # /usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/util.py # code object from 
'/usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bd9d912e0> # /usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/connection.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0bd9667fa0> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bd9890dc0> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bd95ffdc0> import ansible.module_utils.facts.timeout # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/timeout.py import ansible.module_utils.facts.collector # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/collector.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other.facter # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/facter.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other.ohai # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/ohai.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.apparmor # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/apparmor.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.caps # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/caps.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.chroot # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/chroot.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/utils.py import ansible.module_utils.facts.system.cmdline # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/cmdline.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.distribution # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/distribution.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.datetime # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/datetime.py import ansible.module_utils.facts.system.date_time # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/date_time.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.env # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/env.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.dns # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/dns.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.fips # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/fips.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.loadavg # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/loadavg.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc matches /usr/lib64/python3.9/glob.py # code object from '/usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bd9884670> # /usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc matches /usr/lib64/python3.9/configparser.py # code object from '/usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bd9580f10> import ansible.module_utils.facts.system.local # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/local.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.lsb # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/lsb.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.pkg_mgr # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/pkg_mgr.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.platform # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/platform.py # zipimport: zlib available # zipimport: zlib 
available # /usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc matches /usr/lib64/python3.9/ssl.py # code object from '/usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0bd9571c10> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bd95beb20> import ansible.module_utils.facts.system.python # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/python.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.selinux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/selinux.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.version # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/version.py import ansible.module_utils.facts.system.service_mgr # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/service_mgr.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.ssh_pub_keys # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/ssh_pub_keys.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc matches /usr/lib64/python3.9/getpass.py # code object from '/usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0bd94fa4f0> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bd94faa30> import ansible.module_utils.facts.system.user # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/user.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/base.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.aix # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/aix.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 
ansible.module_utils.facts.sysctl # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/sysctl.py import ansible.module_utils.facts.hardware.darwin # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/darwin.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/freebsd.py import ansible.module_utils.facts.hardware.dragonfly # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/hpux.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/linux.py import ansible.module_utils.facts.hardware.hurd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/hurd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/netbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/sunos.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/base.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.generic_bsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/generic_bsd.py import ansible.module_utils.facts.network.aix # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/aix.py # zipimport: zlib available # 
zipimport: zlib available import ansible.module_utils.facts.network.darwin # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/darwin.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.dragonfly # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.fc_wwn # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/fc_wwn.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/freebsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/hpux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.hurd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/hurd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/linux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.iscsi # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/iscsi.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.nvme # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/nvme.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/netbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/sunos.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.base # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/base.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.sysctl # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/sysctl.py import ansible.module_utils.facts.virtual.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/freebsd.py import ansible.module_utils.facts.virtual.dragonfly # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/hpux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/linux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/netbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/sunos.py import ansible.module_utils.facts.default_collectors # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/default_collectors.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.ansible_collector # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/ansible_collector.py import ansible.module_utils.facts.compat # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/compat.py import ansible.module_utils.facts # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dotltlm2/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/__init__.py # zipimport: zlib available # /usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc matches /usr/lib64/python3.9/encodings/idna.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc matches /usr/lib64/python3.9/stringprep.py # code object from '/usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc' # extension module 'unicodedata' loaded from 
'/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0bd932a0d0> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bd932a730> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bd932aa30> import 'gc' # # /usr/lib64/python3.9/multiprocessing/__pycache__/queues.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/queues.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/queues.cpython-39.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bd932afa0> # /usr/lib64/python3.9/multiprocessing/__pycache__/synchronize.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/synchronize.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/synchronize.cpython-39.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bd95455b0> # /usr/lib64/python3.9/multiprocessing/dummy/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.9/multiprocessing/dummy/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/multiprocessing/dummy/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.9/multiprocessing/dummy/__pycache__/connection.cpython-39.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bd94ed280> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0bd94ed790> PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame {"ansible_facts": {"ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBALEARW5ZJ51XTLSDuUsPojumVU0f1DmiQsXjMOap4QLlljOiysapjSUe6pZOyAdiI/KfARhDoOFvlC07kCLCcs7DDk8JxBZpsM0D55SdDlfwsB3FVgWNP+9by8G6kzbePHWdZyyWlAuavj4OAEwAjpWpP8/daus0ha4xywlVVoKjAAAAFQCbiW4bR+tgMvjrxC198dqI1mTbjQAAAIBzCzkJTtnGDKfOHq2dFI5cUEuaj1PgRot3wyaXENzUjZVnIFgXUmgKDCxO+EAtU6uAkBPQF4XNgiuaw5bavYpZxcJ4WIpM4ZDRoSkc7BBbJPRLZ45GfrHJwgqAmAZ3RSvVqeXE4WKQHLm43/eDHewgPqqqWe6QVuQH5SEe79yk3wAAAIEArG+AuupiAeoVJ9Lh36QMj4kRo5pTASh2eD5MqSOdy39UhsXbWBcj3JCIvNk/nwep/9neGyRZ5t5wT05dRX80vlgZJX65hrbepO+lqC3wlng+6GQ34D7TJKYnvEkR3neE0+06kx5R6IRWZf1YQV6fMQhx8AJ2JmvnLFicmYlkhQQ=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQDND+RJCrYgIUzolo5fZ64Ey6cksefKDUWmGDjsqVTmuT3HrlDyUZOro4JAnUQBmiamXsJUFbrFdJAVpukD4yyowqCQLr0ZFuKNEzrt5CObrtWflOskKynO3kaoU0WhDkqIbwS2j/+NxBCxgDGqd/5Os3cOMv3eyjUElz6xoI4zsmGMfxVYmT+/SHBfoyxyqY8Hw2Ooq+H5L9OlYgV4hqu7kKPpM1THUJTjy47m6qvws5gztclLjPA1KIW2Dz6kKzUYspNJcoS2sK1xFvL7mBjpGAP7WhXVH2n5ySenQ24Z6mEj+tG2f11rjPpjCUjDzzciGCWiRDZWBLm/GGmQXJJ8zAYnw82yIUKqufLrr1wmcXICPMVj9pFjXSoBWe/yhX9E87w7YD5HWsUrgrLdSctdV4QYy+R5g9ERi7FjwbRsuZ04BihZs70+f/29hUzuc6MA87KVovGT0Uc7GVC7bx8NLt0bTBsbydlONVHVQuol/YEpQrQophDvmBfh+PgMDH8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOEITn1vyppR+Moe1UdR0WGPhUnQ/dwHNcNi0OYy21LkBQ5jsxOPLvZ+C2MbRYlz2afs4nYYIV8E0AuK6aRks3w=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIKEdFOHVk9tX1R+zEyLVdxS/U5QeeeFYWSnUmjpXlpt7", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_pkg_mgr": "dnf", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_system": "Linux", "ansible_kernel": "5.14.0-508.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 15:49:37 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "ip-10-31-11-158.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-11-158", "ansible_nodename": "ip-10-31-11-158.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "21e18164a0c64d0daed004bd8a1b67b7", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "38", "second": "00", "epoch": "1726882680", "epoch_int": "1726882680", "date": "2024-09-20", "time": "21:38:00", "iso8601_micro": "2024-09-21T01:38:00.038611Z", "iso8601": "2024-09-21T01:38:00Z", "iso8601_basic": "20240920T213800038611", "iso8601_basic_short": "20240920T213800", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_lsb": {}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, 
"ansible_iscsi_iqn": "", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:ef4e1c39-6f50-438a-87e7-12fb70b80bde", "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.40.7 33528 10.31.11.158 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.40.7 33528 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_is_chroot": false, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_local": {}, "ansible_apparmor": {"status": "disabled"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2790, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 742, "free": 2790}, "nocache": {"free": 3253, "used": 279}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2e6858-9a88-b36a-7765-70992ab591a7", "ansible_product_uuid": "ec2e6858-9a88-b36a-7765-70992ab591a7", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 619, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 
268367278080, "size_available": 264238632960, "block_size": 4096, "block_total": 65519355, "block_available": 64511385, "block_used": 1007970, "inode_total": 131071472, "inode_available": 130998694, "inode_used": 72778, "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013"}], "ansible_fips": false, "ansible_loadavg": {"1m": 0.47, "5m": 0.41, "15m": 0.23}, "ansible_fibre_channel_wwn": [], "ansible_interfaces": ["lo", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:4f:68:7a:de:b1", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.11.158", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::104f:68ff:fe7a:deb1", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", 
"tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.11.158", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:4f:68:7a:de:b1", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.11.158"], "ansible_all_ipv6_addresses": ["fe80::104f:68ff:fe7a:deb1"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.11.158", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::104f:68ff:fe7a:deb1"]}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing 
posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing 
shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic 
# destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # 
cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # 
destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing gc # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.util # destroy importlib.abc # destroy importlib.machinery # destroy zipimport # destroy _compression # destroy binascii # destroy importlib # destroy bz2 # destroy lzma # destroy __main__ # destroy locale # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog # destroy uuid # destroy selinux # destroy distro # destroy logging # destroy argparse # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy pickle # destroy _compat_pickle # destroy queue # destroy multiprocessing.reduction # destroy shlex # destroy datetime # destroy base64 # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy json # destroy socket # destroy struct # destroy glob # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping gc # cleanup[3] wiping encodings.idna # destroy 
stringprep # cleanup[3] wiping unicodedata # cleanup[3] wiping termios # cleanup[3] wiping _ssl # cleanup[3] wiping configparser # cleanup[3] wiping _multiprocessing # cleanup[3] wiping _queue # cleanup[3] wiping _pickle # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy gc # destroy unicodedata # destroy termios # destroy _ssl # destroy _multiprocessing # destroy _queue # destroy _pickle # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal # destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize # destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # 
destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal # destroy _frozen_importlib # clear sys.audit hooks , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
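Note: the managed host exports PYTHONVERBOSE=1 (visible in ansible_env above), so the remote Python 3.9 interpreter that runs the setup module prints import and shutdown chatter ("# zipimport: ...", "import ... #", "# cleanup[2] removing ...") around the JSON result in the captured output, which is what triggers the "[WARNING]: Module invocation had junk after the JSON data" message that follows. As an illustration only — this is not the parser ansible-core itself uses, and extract_module_result plus the sample string are invented for the sketch — a JSON document embedded in such output can still be recovered with json.JSONDecoder.raw_decode, which decodes one JSON value and ignores whatever trails it:

import json

def extract_module_result(raw_output):
    # Illustrative sketch only -- not how ansible-core parses module output.
    # Locate the first opening brace and decode exactly one JSON document,
    # ignoring any interpreter chatter before or after it.
    start = raw_output.index("{")
    result, _end = json.JSONDecoder().raw_decode(raw_output, start)
    return result

# Tiny stand-in for the real module output shown above:
sample = (
    "# zipimport: zlib available\n"
    '{"ansible_facts": {"ansible_system": "Linux"}, "invocation": {}}\n'
    "# cleanup[2] removing sys # destroy sys"
)
facts = extract_module_result(sample)["ansible_facts"]
print(facts["ansible_system"])  # -> Linux

raw_decode stops at the end of the first complete JSON value, so the trailing shutdown messages do not break the parse; removing PYTHONVERBOSE from the environment the module runs under should avoid the chatter (and the warning) entirely.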
[WARNING]: Module invocation had junk after the JSON data: # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy 
zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # 
cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing 
ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy 
ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing gc # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.util # destroy importlib.abc # destroy importlib.machinery # destroy zipimport # destroy _compression # destroy binascii # destroy importlib # destroy bz2 # destroy lzma # destroy __main__ # destroy locale # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog # destroy uuid # destroy selinux # destroy distro # destroy logging # destroy argparse # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy 
multiprocessing.pool # destroy pickle # destroy _compat_pickle # destroy queue # destroy multiprocessing.reduction # destroy shlex # destroy datetime # destroy base64 # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy json # destroy socket # destroy struct # destroy glob # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping gc # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping unicodedata # cleanup[3] wiping termios # cleanup[3] wiping _ssl # cleanup[3] wiping configparser # cleanup[3] wiping _multiprocessing # cleanup[3] wiping _queue # cleanup[3] wiping _pickle # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # 
cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy gc # destroy unicodedata # destroy termios # destroy _ssl # destroy _multiprocessing # destroy _queue # destroy _pickle # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal # destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize # destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal # destroy _frozen_importlib # clear sys.audit hooks [WARNING]: Platform linux on host managed_node2 is using the discovered Python interpreter at /usr/bin/python3.9, but future installation of another Python interpreter could change the meaning of that path. See https://docs.ansible.com/ansible- core/2.17/reference_appendices/interpreter_discovery.html for more information. 25201 1726882680.35652: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882678.8938484-25232-173801811291430/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 25201 1726882680.35655: _low_level_execute_command(): starting 25201 1726882680.35658: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882678.8938484-25232-173801811291430/ > /dev/null 2>&1 && sleep 0' 25201 1726882680.36556: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25201 1726882680.36573: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882680.36589: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882680.36618: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882680.36660: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882680.36677: stderr chunk (state=3): >>>debug2: match not found <<< 25201 1726882680.36692: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882680.36721: stderr chunk (state=3): >>>debug1: 
configuration requests final Match pass <<< 25201 1726882680.36734: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 25201 1726882680.36745: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 25201 1726882680.36758: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882680.36775: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882680.36792: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882680.36804: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882680.36816: stderr chunk (state=3): >>>debug2: match found <<< 25201 1726882680.36841: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882680.36917: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882680.36949: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882680.36968: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882680.37100: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882680.38942: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882680.38945: stdout chunk (state=3): >>><<< 25201 1726882680.38948: stderr chunk (state=3): >>><<< 25201 1726882680.39568: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25201 1726882680.39571: handler run complete 25201 1726882680.39574: variable 'ansible_facts' from source: unknown 25201 1726882680.39576: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882680.39578: variable 'ansible_facts' from source: unknown 25201 1726882680.39580: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882680.39687: attempt loop complete, returning result 25201 1726882680.39696: _execute() done 25201 1726882680.39702: dumping result to json 25201 1726882680.39736: done dumping result, returning 25201 1726882680.39750: done running TaskExecutor() for managed_node2/TASK: Gathering Facts [0e448fcc-3ce9-313b-197e-0000000000b9] 
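The "[WARNING]: Module invocation had junk after the JSON data" above, and the pages of "import ...", "# cleanup[...] wiping ..." and "# destroy ..." lines surrounding it, are CPython's verbose import tracing: the setup module is launched with PYTHONVERBOSE=1 (the exact command is visible in the later invocation further down), so the interpreter reports every import at startup and every module teardown at shutdown, and Ansible flags that extra text as junk trailing the module's JSON payload. A minimal local reproduction, shown only as an illustration:

import subprocess
import sys

# Run a trivial program with -v (equivalent to PYTHONVERBOSE=1) and show the kind of
# 'import ...' and '# cleanup[...] ...' chatter that ends up appended to the module output above.
proc = subprocess.run(
    [sys.executable, "-v", "-c", "print('{\"ok\": true}')"],
    capture_output=True,
    text=True,
)
print("payload:", proc.stdout.strip())
trace = [line for line in proc.stderr.splitlines() if line.startswith(("import ", "# "))]
print("\n".join(trace[:10]))  # first few trace lines; run locally like this, the chatter arrives on stderr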
25201 1726882680.39760: sending task result for task 0e448fcc-3ce9-313b-197e-0000000000b9 ok: [managed_node2] 25201 1726882680.41693: no more pending results, returning what we have 25201 1726882680.41696: results queue empty 25201 1726882680.41697: checking for any_errors_fatal 25201 1726882680.41698: done checking for any_errors_fatal 25201 1726882680.41699: checking for max_fail_percentage 25201 1726882680.41701: done checking for max_fail_percentage 25201 1726882680.41701: checking to see if all hosts have failed and the running result is not ok 25201 1726882680.41702: done checking to see if all hosts have failed 25201 1726882680.41703: getting the remaining hosts for this loop 25201 1726882680.41705: done getting the remaining hosts for this loop 25201 1726882680.41708: getting the next task for host managed_node2 25201 1726882680.41713: done getting next task for host managed_node2 25201 1726882680.41715: ^ task is: TASK: meta (flush_handlers) 25201 1726882680.41717: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25201 1726882680.41734: getting variables 25201 1726882680.41737: in VariableManager get_vars() 25201 1726882680.41759: Calling all_inventory to load vars for managed_node2 25201 1726882680.41765: Calling groups_inventory to load vars for managed_node2 25201 1726882680.41784: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882680.41807: Calling all_plugins_play to load vars for managed_node2 25201 1726882680.41811: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882680.41815: Calling groups_plugins_play to load vars for managed_node2 25201 1726882680.42020: done sending task result for task 0e448fcc-3ce9-313b-197e-0000000000b9 25201 1726882680.42023: WORKER PROCESS EXITING 25201 1726882680.42038: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882680.42248: done with get_vars() 25201 1726882680.42260: done getting variables 25201 1726882680.42332: in VariableManager get_vars() 25201 1726882680.42342: Calling all_inventory to load vars for managed_node2 25201 1726882680.42344: Calling groups_inventory to load vars for managed_node2 25201 1726882680.42347: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882680.42351: Calling all_plugins_play to load vars for managed_node2 25201 1726882680.42353: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882680.42362: Calling groups_plugins_play to load vars for managed_node2 25201 1726882680.42509: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882680.42700: done with get_vars() 25201 1726882680.42714: done queuing things up, now waiting for results queue to drain 25201 1726882680.42716: results queue empty 25201 1726882680.42717: checking for any_errors_fatal 25201 1726882680.42719: done checking for any_errors_fatal 25201 1726882680.42720: checking for max_fail_percentage 25201 1726882680.42720: done checking for max_fail_percentage 25201 1726882680.42721: checking to see if all hosts have failed and the running result is not ok 25201 1726882680.42722: done checking to see if all hosts have failed 25201 1726882680.42723: getting the remaining hosts for 
this loop 25201 1726882680.42724: done getting the remaining hosts for this loop 25201 1726882680.42726: getting the next task for host managed_node2 25201 1726882680.42730: done getting next task for host managed_node2 25201 1726882680.42732: ^ task is: TASK: Include the task 'el_repo_setup.yml' 25201 1726882680.42734: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25201 1726882680.42736: getting variables 25201 1726882680.42737: in VariableManager get_vars() 25201 1726882680.42744: Calling all_inventory to load vars for managed_node2 25201 1726882680.42746: Calling groups_inventory to load vars for managed_node2 25201 1726882680.42748: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882680.42752: Calling all_plugins_play to load vars for managed_node2 25201 1726882680.42754: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882680.42756: Calling groups_plugins_play to load vars for managed_node2 25201 1726882680.42891: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882680.43085: done with get_vars() 25201 1726882680.43094: done getting variables TASK [Include the task 'el_repo_setup.yml'] ************************************ task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tests_ipv6_nm.yml:11 Friday 20 September 2024 21:38:00 -0400 (0:00:01.591) 0:00:01.606 ****** 25201 1726882680.43162: entering _queue_task() for managed_node2/include_tasks 25201 1726882680.43166: Creating lock for include_tasks 25201 1726882680.43409: worker is 1 (out of 1 available) 25201 1726882680.43420: exiting _queue_task() for managed_node2/include_tasks 25201 1726882680.43430: done queuing things up, now waiting for results queue to drain 25201 1726882680.43432: waiting for pending results... 
25201 1726882680.43796: running TaskExecutor() for managed_node2/TASK: Include the task 'el_repo_setup.yml' 25201 1726882680.43878: in run() - task 0e448fcc-3ce9-313b-197e-000000000006 25201 1726882680.43899: variable 'ansible_search_path' from source: unknown 25201 1726882680.43932: calling self._execute() 25201 1726882680.44002: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882680.44012: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882680.44024: variable 'omit' from source: magic vars 25201 1726882680.44126: _execute() done 25201 1726882680.44134: dumping result to json 25201 1726882680.44140: done dumping result, returning 25201 1726882680.44150: done running TaskExecutor() for managed_node2/TASK: Include the task 'el_repo_setup.yml' [0e448fcc-3ce9-313b-197e-000000000006] 25201 1726882680.44158: sending task result for task 0e448fcc-3ce9-313b-197e-000000000006 25201 1726882680.44296: no more pending results, returning what we have 25201 1726882680.44302: in VariableManager get_vars() 25201 1726882680.44333: Calling all_inventory to load vars for managed_node2 25201 1726882680.44336: Calling groups_inventory to load vars for managed_node2 25201 1726882680.44339: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882680.44352: Calling all_plugins_play to load vars for managed_node2 25201 1726882680.44355: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882680.44358: Calling groups_plugins_play to load vars for managed_node2 25201 1726882680.44706: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882680.45384: done with get_vars() 25201 1726882680.45391: variable 'ansible_search_path' from source: unknown 25201 1726882680.45405: we have included files to process 25201 1726882680.45406: generating all_blocks data 25201 1726882680.45408: done generating all_blocks data 25201 1726882680.45409: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 25201 1726882680.45410: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 25201 1726882680.45413: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 25201 1726882680.45863: done sending task result for task 0e448fcc-3ce9-313b-197e-000000000006 25201 1726882680.45874: WORKER PROCESS EXITING 25201 1726882680.46463: in VariableManager get_vars() 25201 1726882680.46504: done with get_vars() 25201 1726882680.46518: done processing included file 25201 1726882680.46520: iterating over new_blocks loaded from include file 25201 1726882680.46521: in VariableManager get_vars() 25201 1726882680.46531: done with get_vars() 25201 1726882680.46533: filtering new block on tags 25201 1726882680.46547: done filtering new block on tags 25201 1726882680.46550: in VariableManager get_vars() 25201 1726882680.46561: done with get_vars() 25201 1726882680.46562: filtering new block on tags 25201 1726882680.46581: done filtering new block on tags 25201 1726882680.46584: in VariableManager get_vars() 25201 1726882680.46612: done with get_vars() 25201 1726882680.46614: filtering new block on tags 25201 1726882680.46627: done filtering new block on tags 25201 1726882680.46629: done iterating over new_blocks loaded from include file included: 
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml for managed_node2 25201 1726882680.46634: extending task lists for all hosts with included blocks 25201 1726882680.46723: done extending task lists 25201 1726882680.46724: done processing included files 25201 1726882680.46725: results queue empty 25201 1726882680.46726: checking for any_errors_fatal 25201 1726882680.46727: done checking for any_errors_fatal 25201 1726882680.46728: checking for max_fail_percentage 25201 1726882680.46729: done checking for max_fail_percentage 25201 1726882680.46730: checking to see if all hosts have failed and the running result is not ok 25201 1726882680.46730: done checking to see if all hosts have failed 25201 1726882680.46731: getting the remaining hosts for this loop 25201 1726882680.46732: done getting the remaining hosts for this loop 25201 1726882680.46735: getting the next task for host managed_node2 25201 1726882680.46739: done getting next task for host managed_node2 25201 1726882680.46741: ^ task is: TASK: Gather the minimum subset of ansible_facts required by the network role test 25201 1726882680.46743: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25201 1726882680.46744: getting variables 25201 1726882680.46745: in VariableManager get_vars() 25201 1726882680.46753: Calling all_inventory to load vars for managed_node2 25201 1726882680.46755: Calling groups_inventory to load vars for managed_node2 25201 1726882680.46757: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882680.46767: Calling all_plugins_play to load vars for managed_node2 25201 1726882680.46770: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882680.46773: Calling groups_plugins_play to load vars for managed_node2 25201 1726882680.46902: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882680.47088: done with get_vars() 25201 1726882680.47095: done getting variables TASK [Gather the minimum subset of ansible_facts required by the network role test] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3 Friday 20 September 2024 21:38:00 -0400 (0:00:00.039) 0:00:01.646 ****** 25201 1726882680.47157: entering _queue_task() for managed_node2/setup 25201 1726882680.47360: worker is 1 (out of 1 available) 25201 1726882680.47374: exiting _queue_task() for managed_node2/setup 25201 1726882680.47384: done queuing things up, now waiting for results queue to drain 25201 1726882680.47386: waiting for pending results... 
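The two task banners above each carry a timing pair that appears to follow the profile-style format "(duration of the previous task) cumulative elapsed": 0:00:01.591 spent on the fact-gathering step with 0:00:01.606 total, then 0:00:00.039 for the include step with 0:00:01.646 total, and the running sum matches to within a millisecond of rounding. The small parser below only illustrates that format; the function names are made up and the values are copied from the banners.

from datetime import timedelta
import re

def to_delta(h: str, m: str, s: str) -> timedelta:
    return timedelta(hours=int(h), minutes=int(m), seconds=float(s))

def parse_banner_timings(fragment: str) -> tuple[timedelta, timedelta]:
    """Split '(H:MM:SS.fff) H:MM:SS.fff' into (previous task duration, cumulative elapsed)."""
    first, second = re.findall(r"(\d+):(\d+):(\d+\.\d+)", fragment)
    return to_delta(*first), to_delta(*second)

prev_task, total_so_far = parse_banner_timings("(0:00:01.591) 0:00:01.606")  # banner before the include
this_task, total_after = parse_banner_timings("(0:00:00.039) 0:00:01.646")   # banner after it
# 1.606 s + 0.039 s = 1.645 s, matching the printed 1.646 s up to rounding of the raw values.
print(total_so_far + this_task, total_after)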
25201 1726882680.47609: running TaskExecutor() for managed_node2/TASK: Gather the minimum subset of ansible_facts required by the network role test 25201 1726882680.47699: in run() - task 0e448fcc-3ce9-313b-197e-0000000000ca 25201 1726882680.47715: variable 'ansible_search_path' from source: unknown 25201 1726882680.47727: variable 'ansible_search_path' from source: unknown 25201 1726882680.47884: calling self._execute() 25201 1726882680.47972: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882680.47988: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882680.48001: variable 'omit' from source: magic vars 25201 1726882680.48736: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 25201 1726882680.51019: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 25201 1726882680.51104: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 25201 1726882680.51141: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 25201 1726882680.51181: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 25201 1726882680.51253: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 25201 1726882680.51369: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25201 1726882680.51438: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25201 1726882680.52258: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25201 1726882680.52357: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25201 1726882680.52382: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25201 1726882680.52814: variable 'ansible_facts' from source: unknown 25201 1726882680.52998: variable 'network_test_required_facts' from source: task vars 25201 1726882680.53043: Evaluated conditional (not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts): True 25201 1726882680.53105: variable 'omit' from source: magic vars 25201 1726882680.53150: variable 'omit' from source: magic vars 25201 1726882680.53281: variable 'omit' from source: magic vars 25201 1726882680.53329: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25201 1726882680.53370: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25201 1726882680.53391: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25201 1726882680.53411: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25201 1726882680.53429: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25201 1726882680.53467: variable 'inventory_hostname' from source: host vars for 'managed_node2' 25201 1726882680.53477: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882680.53486: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882680.53593: Set connection var ansible_shell_executable to /bin/sh 25201 1726882680.53603: Set connection var ansible_pipelining to False 25201 1726882680.53611: Set connection var ansible_connection to ssh 25201 1726882680.53619: Set connection var ansible_module_compression to ZIP_DEFLATED 25201 1726882680.53625: Set connection var ansible_shell_type to sh 25201 1726882680.53641: Set connection var ansible_timeout to 10 25201 1726882680.53674: variable 'ansible_shell_executable' from source: unknown 25201 1726882680.53685: variable 'ansible_connection' from source: unknown 25201 1726882680.53691: variable 'ansible_module_compression' from source: unknown 25201 1726882680.53697: variable 'ansible_shell_type' from source: unknown 25201 1726882680.53703: variable 'ansible_shell_executable' from source: unknown 25201 1726882680.53708: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882680.53715: variable 'ansible_pipelining' from source: unknown 25201 1726882680.53720: variable 'ansible_timeout' from source: unknown 25201 1726882680.53726: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882680.53876: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 25201 1726882680.53896: variable 'omit' from source: magic vars 25201 1726882680.53907: starting attempt loop 25201 1726882680.53913: running the handler 25201 1726882680.53928: _low_level_execute_command(): starting 25201 1726882680.53938: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 25201 1726882680.54763: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25201 1726882680.54781: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882680.54800: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882680.54817: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882680.54861: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882680.54879: stderr chunk (state=3): >>>debug2: match not found <<< 25201 1726882680.54893: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882680.54916: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25201 1726882680.54928: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 25201 1726882680.54943: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 25201 1726882680.54956: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 
1726882680.54973: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882680.54993: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882680.55005: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882680.55014: stderr chunk (state=3): >>>debug2: match found <<< 25201 1726882680.55031: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882680.55115: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882680.55139: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882680.55167: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882680.55297: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 25201 1726882680.57587: stdout chunk (state=3): >>>/root <<< 25201 1726882680.57591: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882680.57593: stdout chunk (state=3): >>><<< 25201 1726882680.57596: stderr chunk (state=3): >>><<< 25201 1726882680.57702: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 25201 1726882680.57707: _low_level_execute_command(): starting 25201 1726882680.57711: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882680.5761964-25285-156143968590636 `" && echo ansible-tmp-1726882680.5761964-25285-156143968590636="` echo /root/.ansible/tmp/ansible-tmp-1726882680.5761964-25285-156143968590636 `" ) && sleep 0' 25201 1726882680.58481: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25201 1726882680.58526: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882680.58541: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882680.58558: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882680.58624: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882680.58638: stderr chunk (state=3): >>>debug2: 
match not found <<< 25201 1726882680.58651: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882680.58676: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25201 1726882680.58979: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 25201 1726882680.58990: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 25201 1726882680.59006: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882680.59019: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882680.59033: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882680.59044: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882680.59053: stderr chunk (state=3): >>>debug2: match found <<< 25201 1726882680.59070: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882680.59154: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882680.59183: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882680.59201: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882680.59330: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882680.61458: stdout chunk (state=3): >>>ansible-tmp-1726882680.5761964-25285-156143968590636=/root/.ansible/tmp/ansible-tmp-1726882680.5761964-25285-156143968590636 <<< 25201 1726882680.61614: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882680.61680: stderr chunk (state=3): >>><<< 25201 1726882680.61685: stdout chunk (state=3): >>><<< 25201 1726882680.61974: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882680.5761964-25285-156143968590636=/root/.ansible/tmp/ansible-tmp-1726882680.5761964-25285-156143968590636 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25201 1726882680.61977: variable 'ansible_module_compression' from source: unknown 25201 1726882680.61980: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-25201fmfeipqk/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 25201 1726882680.61982: 
variable 'ansible_facts' from source: unknown 25201 1726882680.62060: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882680.5761964-25285-156143968590636/AnsiballZ_setup.py 25201 1726882680.62230: Sending initial data 25201 1726882680.62233: Sent initial data (154 bytes) 25201 1726882680.63921: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25201 1726882680.63937: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882680.63951: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882680.63979: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882680.64035: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882680.64081: stderr chunk (state=3): >>>debug2: match not found <<< 25201 1726882680.64094: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882680.64145: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25201 1726882680.64158: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 25201 1726882680.64172: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 25201 1726882680.64185: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882680.64200: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882680.64220: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882680.64238: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882680.64250: stderr chunk (state=3): >>>debug2: match found <<< 25201 1726882680.64267: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882680.64454: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882680.64476: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882680.64496: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882680.64629: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 25201 1726882680.67032: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 25201 1726882680.67130: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 25201 1726882680.67236: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-25201fmfeipqk/tmpe1ku1mh4 /root/.ansible/tmp/ansible-tmp-1726882680.5761964-25285-156143968590636/AnsiballZ_setup.py <<< 25201 1726882680.67332: stderr chunk (state=3): 
>>>debug1: Couldn't stat remote file: No such file or directory <<< 25201 1726882680.70362: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882680.70609: stderr chunk (state=3): >>><<< 25201 1726882680.70612: stdout chunk (state=3): >>><<< 25201 1726882680.70615: done transferring module to remote 25201 1726882680.70617: _low_level_execute_command(): starting 25201 1726882680.70619: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882680.5761964-25285-156143968590636/ /root/.ansible/tmp/ansible-tmp-1726882680.5761964-25285-156143968590636/AnsiballZ_setup.py && sleep 0' 25201 1726882680.71181: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25201 1726882680.71194: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882680.71207: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882680.71227: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882680.71270: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882680.71284: stderr chunk (state=3): >>>debug2: match not found <<< 25201 1726882680.71296: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882680.71312: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25201 1726882680.71323: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 25201 1726882680.71332: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 25201 1726882680.71342: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882680.71354: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882680.71373: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882680.71385: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882680.71395: stderr chunk (state=3): >>>debug2: match found <<< 25201 1726882680.71407: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882680.71485: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882680.71505: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882680.71520: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882680.71651: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 25201 1726882680.74201: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882680.74204: stdout chunk (state=3): >>><<< 25201 1726882680.74206: stderr chunk (state=3): >>><<< 25201 1726882680.74298: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 25201 1726882680.74301: _low_level_execute_command(): starting 25201 1726882680.74304: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882680.5761964-25285-156143968590636/AnsiballZ_setup.py && sleep 0' 25201 1726882680.74887: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25201 1726882680.74903: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882680.74917: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882680.74933: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882680.74978: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882680.74995: stderr chunk (state=3): >>>debug2: match not found <<< 25201 1726882680.75009: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882680.75026: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25201 1726882680.75038: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 25201 1726882680.75049: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 25201 1726882680.75060: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882680.75077: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882680.75099: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882680.75114: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882680.75124: stderr chunk (state=3): >>>debug2: match found <<< 25201 1726882680.75136: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882680.75225: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882680.75244: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882680.75258: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882680.75398: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 25201 1726882680.78157: stdout chunk (state=3): >>>import _frozen_importlib # frozen import _imp # builtin <<< 25201 1726882680.78206: stdout chunk (state=3): >>>import '_thread' # import '_warnings' # <<< 25201 1726882680.78238: stdout chunk (state=3): >>> import '_weakref' # <<< 25201 1726882680.78327: stdout chunk (state=3): >>>import '_io' # <<< 
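
The record above executes AnsiballZ_setup.py through /bin/sh with PYTHONVERBOSE=1 set, which is why the remainder of this capture is dominated by the interpreter's import trace rather than module output. As a minimal sketch of how that kind of trace is produced (not taken from Ansible's code; sys.executable and the `import json` target are illustrative assumptions), the same output can be reproduced locally:

    import subprocess
    import sys

    # -v is the command-line equivalent of the PYTHONVERBOSE=1 environment
    # variable that appears in the /bin/sh command in the record above.
    proc = subprocess.run(
        [sys.executable, "-v", "-c", "import json"],
        capture_output=True,
        text=True,
    )

    # CPython writes the trace lines ("import ...", "# code object from ...",
    # "# .../__pycache__/... matches ...") to stderr.
    for line in proc.stderr.splitlines()[:10]:
        print(line)
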
25201 1726882680.78330: stdout chunk (state=3): >>> <<< 25201 1726882680.78339: stdout chunk (state=3): >>>import 'marshal' # <<< 25201 1726882680.78395: stdout chunk (state=3): >>>import 'posix' # <<< 25201 1726882680.78399: stdout chunk (state=3): >>> <<< 25201 1726882680.78440: stdout chunk (state=3): >>>import '_frozen_importlib_external' # <<< 25201 1726882680.78457: stdout chunk (state=3): >>> <<< 25201 1726882680.78461: stdout chunk (state=3): >>># installing zipimport hook<<< 25201 1726882680.78472: stdout chunk (state=3): >>> <<< 25201 1726882680.78513: stdout chunk (state=3): >>>import 'time' # <<< 25201 1726882680.78516: stdout chunk (state=3): >>> <<< 25201 1726882680.78553: stdout chunk (state=3): >>>import 'zipimport' # <<< 25201 1726882680.78557: stdout chunk (state=3): >>> <<< 25201 1726882680.78559: stdout chunk (state=3): >>># installed zipimport hook<<< 25201 1726882680.78561: stdout chunk (state=3): >>> <<< 25201 1726882680.78628: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py<<< 25201 1726882680.78640: stdout chunk (state=3): >>> <<< 25201 1726882680.78651: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' <<< 25201 1726882680.78690: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py<<< 25201 1726882680.78692: stdout chunk (state=3): >>> <<< 25201 1726882680.78726: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc'<<< 25201 1726882680.78729: stdout chunk (state=3): >>> <<< 25201 1726882680.78749: stdout chunk (state=3): >>>import '_codecs' # <<< 25201 1726882680.78754: stdout chunk (state=3): >>> <<< 25201 1726882680.78795: stdout chunk (state=3): >>>import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7f499911edc0><<< 25201 1726882680.78797: stdout chunk (state=3): >>> <<< 25201 1726882680.78848: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py<<< 25201 1726882680.78851: stdout chunk (state=3): >>> <<< 25201 1726882680.78893: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' <<< 25201 1726882680.78923: stdout chunk (state=3): >>>import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f49990c33a0> <<< 25201 1726882680.78925: stdout chunk (state=3): >>>import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f499911eb20> <<< 25201 1726882680.78972: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py<<< 25201 1726882680.78976: stdout chunk (state=3): >>> <<< 25201 1726882680.78978: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc'<<< 25201 1726882680.78980: stdout chunk (state=3): >>> <<< 25201 1726882680.79011: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f499911eac0><<< 25201 1726882680.79014: stdout chunk (state=3): >>> <<< 25201 1726882680.79045: stdout chunk (state=3): >>>import '_signal' # <<< 25201 1726882680.79050: stdout chunk (state=3): >>> <<< 25201 1726882680.79088: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py<<< 25201 1726882680.79099: stdout chunk (state=3): >>> <<< 25201 1726882680.79102: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc'<<< 25201 1726882680.79104: stdout chunk (state=3): >>> <<< 25201 1726882680.79131: stdout chunk (state=3): >>>import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7f49990c3490><<< 25201 1726882680.79141: stdout chunk (state=3): >>> <<< 25201 1726882680.79159: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py<<< 25201 1726882680.79186: stdout chunk (state=3): >>> <<< 25201 1726882680.79189: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' <<< 25201 1726882680.79213: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py<<< 25201 1726882680.79235: stdout chunk (state=3): >>> <<< 25201 1726882680.79245: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' <<< 25201 1726882680.79277: stdout chunk (state=3): >>>import '_abc' # <<< 25201 1726882680.79295: stdout chunk (state=3): >>>import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f49990c3940> <<< 25201 1726882680.79325: stdout chunk (state=3): >>>import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7f49990c3670><<< 25201 1726882680.79330: stdout chunk (state=3): >>> <<< 25201 1726882680.79373: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py<<< 25201 1726882680.79378: stdout chunk (state=3): >>> <<< 25201 1726882680.79398: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc'<<< 25201 1726882680.79409: stdout chunk (state=3): >>> <<< 25201 1726882680.79437: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py<<< 25201 1726882680.79442: stdout chunk (state=3): >>> <<< 25201 1726882680.79480: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc'<<< 25201 1726882680.79484: stdout chunk (state=3): >>> <<< 25201 1726882680.79510: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py<<< 25201 1726882680.79517: stdout chunk (state=3): >>> <<< 25201 1726882680.79546: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' <<< 25201 1726882680.79592: stdout chunk (state=3): >>>import '_stat' # <<< 25201 1726882680.79603: stdout chunk (state=3): >>>import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7f499907a190><<< 25201 1726882680.79608: stdout chunk (state=3): >>> <<< 25201 1726882680.79639: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py<<< 25201 1726882680.79642: stdout chunk (state=3): >>> <<< 25201 1726882680.79684: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' <<< 25201 1726882680.79799: stdout chunk (state=3): >>>import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f499907a220><<< 25201 1726882680.79803: 
stdout chunk (state=3): >>> <<< 25201 1726882680.79837: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py<<< 25201 1726882680.79849: stdout chunk (state=3): >>> <<< 25201 1726882680.79867: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' <<< 25201 1726882680.79907: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py<<< 25201 1726882680.79934: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' <<< 25201 1726882680.79962: stdout chunk (state=3): >>>import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f499909d850> import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f499907a940><<< 25201 1726882680.79975: stdout chunk (state=3): >>> <<< 25201 1726882680.80006: stdout chunk (state=3): >>>import 'os' # <_frozen_importlib_external.SourceFileLoader object at 0x7f49990db880> <<< 25201 1726882680.80080: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py<<< 25201 1726882680.80084: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' <<< 25201 1726882680.80086: stdout chunk (state=3): >>>import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4999073d90> <<< 25201 1726882680.80149: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py<<< 25201 1726882680.80174: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc'<<< 25201 1726882680.80195: stdout chunk (state=3): >>> import '_locale' # <<< 25201 1726882680.80221: stdout chunk (state=3): >>> import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f499909dd90> <<< 25201 1726882680.80292: stdout chunk (state=3): >>>import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7f49990c3970><<< 25201 1726882680.80303: stdout chunk (state=3): >>> <<< 25201 1726882680.80341: stdout chunk (state=3): >>>Python 3.9.19 (main, Aug 23 2024, 00:00:00) <<< 25201 1726882680.80359: stdout chunk (state=3): >>> [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information.<<< 25201 1726882680.80375: stdout chunk (state=3): >>> <<< 25201 1726882680.80905: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py <<< 25201 1726882680.80929: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc'<<< 25201 1726882680.80932: stdout chunk (state=3): >>> <<< 25201 1726882680.80977: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py<<< 25201 1726882680.81009: stdout chunk (state=3): >>> <<< 25201 1726882680.81015: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' <<< 25201 1726882680.81035: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py <<< 25201 1726882680.81087: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc'<<< 
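
The repeated "…/__pycache__/<name>.cpython-39.pyc matches …/<name>.py" lines in this trace are CPython's bytecode-cache lookup for each source file it imports. A small illustrative sketch of that mapping, using a path copied from the trace (the cache tag in the result reflects whichever interpreter runs it, so cpython-39 under /usr/bin/python3.9):

    import importlib.util

    # Source path taken from the trace above.
    print(importlib.util.cache_from_source("/usr/lib64/python3.9/abc.py"))
    # e.g. /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc

    # And the reverse mapping, cache file back to source file.
    print(importlib.util.source_from_cache(
        "/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc"))
    # e.g. /usr/lib64/python3.9/abc.py
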
25201 1726882680.81090: stdout chunk (state=3): >>> <<< 25201 1726882680.81122: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py <<< 25201 1726882680.81152: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' <<< 25201 1726882680.81178: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4998dd3eb0> <<< 25201 1726882680.81248: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4998dd5f40><<< 25201 1726882680.81252: stdout chunk (state=3): >>> <<< 25201 1726882680.81297: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py <<< 25201 1726882680.81300: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' <<< 25201 1726882680.81327: stdout chunk (state=3): >>>import '_sre' # <<< 25201 1726882680.81345: stdout chunk (state=3): >>> <<< 25201 1726882680.81372: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py <<< 25201 1726882680.81403: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc'<<< 25201 1726882680.81407: stdout chunk (state=3): >>> <<< 25201 1726882680.81449: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py<<< 25201 1726882680.81452: stdout chunk (state=3): >>> <<< 25201 1726882680.81479: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' <<< 25201 1726882680.81530: stdout chunk (state=3): >>>import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4998dcc610> <<< 25201 1726882680.81595: stdout chunk (state=3): >>>import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4998dd2640><<< 25201 1726882680.81615: stdout chunk (state=3): >>> <<< 25201 1726882680.81629: stdout chunk (state=3): >>>import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4998dd3370> # /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py <<< 25201 1726882680.81756: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc'<<< 25201 1726882680.81793: stdout chunk (state=3): >>> <<< 25201 1726882680.81830: stdout chunk (state=3): >>># /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/collections/__init__.py <<< 25201 1726882680.81889: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc'<<< 25201 1726882680.81910: stdout chunk (state=3): >>> <<< 25201 1726882680.81913: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py <<< 25201 1726882680.82186: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 
0x7f4998c8edf0> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4998c8e8e0> import 'itertools' # # /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py # code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4998c8eee0> # /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches /usr/lib64/python3.9/operator.py # code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4998c8efa0> # /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py # code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4998c8eeb0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4998daed60> import '_functools' # <<< 25201 1726882680.82209: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4998da7640> <<< 25201 1726882680.82287: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py # code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' <<< 25201 1726882680.82294: stdout chunk (state=3): >>>import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4998dba6a0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4998ddadf0> <<< 25201 1726882680.82312: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' <<< 25201 1726882680.82350: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4998ca1ca0> <<< 25201 1726882680.82353: stdout chunk (state=3): >>>import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4998dae280> <<< 25201 1726882680.82388: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' <<< 25201 1726882680.82401: stdout chunk (state=3): >>># extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4998dba2b0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4998de09a0> <<< 25201 1726882680.82428: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py <<< 25201 1726882680.82434: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' <<< 25201 1726882680.82461: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py <<< 25201 1726882680.82465: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' <<< 25201 1726882680.82486: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py <<< 25201 1726882680.82498: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' <<< 25201 1726882680.82512: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4998ca1fd0> <<< 25201 1726882680.82519: stdout chunk (state=3): >>>import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4998ca1dc0> <<< 25201 1726882680.82543: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches /usr/lib64/python3.9/importlib/machinery.py <<< 25201 1726882680.82550: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4998ca1d30> <<< 25201 1726882680.82573: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' <<< 25201 1726882680.82603: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py <<< 25201 1726882680.82610: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' <<< 25201 1726882680.82635: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py <<< 25201 1726882680.82698: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' <<< 25201 1726882680.82728: stdout chunk (state=3): >>># /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py # code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4998c743a0> <<< 25201 1726882680.82751: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches /usr/lib64/python3.9/contextlib.py <<< 25201 1726882680.82766: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' <<< 25201 1726882680.82831: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4998c74490> <<< 25201 1726882680.83013: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4998ca9fd0> <<< 25201 1726882680.83061: stdout chunk (state=3): >>>import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4998ca3a60> <<< 25201 1726882680.83073: stdout chunk (state=3): >>>import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4998ca3580> <<< 25201 1726882680.83096: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py <<< 25201 1726882680.83105: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' <<< 25201 1726882680.83146: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc 
matches /usr/lib64/python3.9/weakref.py <<< 25201 1726882680.83156: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' <<< 25201 1726882680.83194: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py # code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc'<<< 25201 1726882680.83202: stdout chunk (state=3): >>> import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4998bc21f0> <<< 25201 1726882680.83241: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4998c5fb80> <<< 25201 1726882680.83311: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4998ca3ee0> <<< 25201 1726882680.83319: stdout chunk (state=3): >>>import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4998ddafd0> <<< 25201 1726882680.83338: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py <<< 25201 1726882680.83373: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' <<< 25201 1726882680.83396: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py <<< 25201 1726882680.83414: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4998bd4b20> <<< 25201 1726882680.83418: stdout chunk (state=3): >>>import 'errno' # <<< 25201 1726882680.83478: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4998bd4e50> # /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py <<< 25201 1726882680.83486: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' <<< 25201 1726882680.83511: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py <<< 25201 1726882680.83519: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4998be6760> <<< 25201 1726882680.83582: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py # code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' <<< 25201 1726882680.83616: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4998be6ca0> <<< 25201 1726882680.83651: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' <<< 25201 1726882680.83688: stdout chunk (state=3): >>># extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4998b7e3d0> import 'bz2' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f4998bd4f40> <<< 25201 1726882680.83701: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py # code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' <<< 25201 1726882680.83741: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' <<< 25201 1726882680.83758: stdout chunk (state=3): >>># extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4998b8f2b0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4998be65e0> <<< 25201 1726882680.83774: stdout chunk (state=3): >>>import 'pwd' # <<< 25201 1726882680.83803: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' <<< 25201 1726882680.83815: stdout chunk (state=3): >>># extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4998b8f370> <<< 25201 1726882680.83836: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4998ca1a00> <<< 25201 1726882680.83918: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py # code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py # code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' <<< 25201 1726882680.84072: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4998baa6d0> # /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4998baa9a0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4998baa790> # extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4998baa880> # /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc matches /usr/lib64/python3.9/hashlib.py <<< 25201 1726882680.84084: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' <<< 25201 1726882680.84620: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' # extension module '_hashlib' executed 
from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4998baacd0> # extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4998bb7220> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4998baa910> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4998b9ea60> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4998ca15e0> # /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py # code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4998baaac0> <<< 25201 1726882680.84877: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f49985e76a0> <<< 25201 1726882680.85175: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip' # zipimport: zlib available <<< 25201 1726882680.85331: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882680.85365: stdout chunk (state=3): >>>import ansible # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/__init__.py # zipimport: zlib available <<< 25201 1726882680.85406: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882680.85410: stdout chunk (state=3): >>>import ansible.module_utils # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/__init__.py <<< 25201 1726882680.85421: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882680.86938: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882680.87858: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py # code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f49985237f0> <<< 25201 1726882680.87889: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py # code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' <<< 25201 1726882680.87912: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' <<< 25201 1726882680.87939: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py # code object from '/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' <<< 25201 1726882680.87972: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f4998523160> <<< 25201 1726882680.88010: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4998523280> <<< 25201 1726882680.88036: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4998523f40> <<< 25201 1726882680.88063: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' <<< 25201 1726882680.88120: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f49985234f0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4998523d60> <<< 25201 1726882680.88136: stdout chunk (state=3): >>>import 'atexit' # <<< 25201 1726882680.88164: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4998523fa0> <<< 25201 1726882680.88169: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py <<< 25201 1726882680.88195: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' <<< 25201 1726882680.88228: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4998523100> <<< 25201 1726882680.88261: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py <<< 25201 1726882680.88275: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' <<< 25201 1726882680.88293: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py <<< 25201 1726882680.88315: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' <<< 25201 1726882680.88328: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py # code object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' <<< 25201 1726882680.88419: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f49984fa100> <<< 25201 1726882680.88450: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f49983ff100> <<< 25201 1726882680.88486: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' <<< 25201 1726882680.88515: stdout chunk (state=3): >>># extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f49983ff2e0> <<< 25201 
1726882680.88518: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py # code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' <<< 25201 1726882680.88539: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f49983ffc70> <<< 25201 1726882680.88565: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f499850adc0> <<< 25201 1726882680.88723: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f499850a3a0> <<< 25201 1726882680.88773: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py # code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' <<< 25201 1726882680.88789: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f499850afa0> <<< 25201 1726882680.88802: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py # code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' <<< 25201 1726882680.88835: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py # code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' <<< 25201 1726882680.88880: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches /usr/lib64/python3.9/tokenize.py <<< 25201 1726882680.88884: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' <<< 25201 1726882680.88896: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py # code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f499855ac70> <<< 25201 1726882680.88977: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4998505d00> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f49985053d0> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f49984d8b50> <<< 25201 1726882680.89005: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f49985054f0> <<< 25201 1726882680.89031: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4998505520> <<< 25201 1726882680.89077: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py <<< 25201 1726882680.89080: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' <<< 
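
The "# zipimport: found 103 names in …/ansible_setup_payload.zip" and "loaded from Zip" lines earlier in this trace come from CPython's zipimport machinery: the AnsiballZ payload is a zip archive that the wrapper places on sys.path, so the module and its module_utils are imported directly from the archive. A minimal, self-contained sketch of that mechanism (the package name and contents here are invented for illustration, not Ansible's payload layout):

    import importlib
    import os
    import sys
    import tempfile
    import zipfile

    # Build a tiny archive containing one importable package.
    tmpdir = tempfile.mkdtemp()
    zip_path = os.path.join(tmpdir, "payload_demo.zip")
    with zipfile.ZipFile(zip_path, "w") as zf:
        zf.writestr("payload_demo/__init__.py", "GREETING = 'loaded from zip'\n")

    # Putting the archive itself on sys.path lets zipimport resolve imports from it.
    sys.path.insert(0, zip_path)
    mod = importlib.import_module("payload_demo")
    print(mod.GREETING, "->", mod.__file__)   # __file__ points inside the .zip
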
25201 1726882680.89093: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py <<< 25201 1726882680.89121: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' <<< 25201 1726882680.89186: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f499845b310> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f499856c220> <<< 25201 1726882680.89219: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py # code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' <<< 25201 1726882680.89279: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4998467880> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f499856c3a0> <<< 25201 1726882680.89299: stdout chunk (state=3): >>># /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py <<< 25201 1726882680.89344: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' <<< 25201 1726882680.89368: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py # code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' <<< 25201 1726882680.89380: stdout chunk (state=3): >>>import '_string' # <<< 25201 1726882680.89431: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f499856cca0> <<< 25201 1726882680.89559: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4998467820> <<< 25201 1726882680.89653: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4998504af0> <<< 25201 1726882680.89681: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f499856c940> <<< 25201 1726882680.89741: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f499856c5b0> <<< 25201 1726882680.89745: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f49985648e0> <<< 25201 1726882680.89779: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py <<< 25201 1726882680.89793: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' <<< 25201 1726882680.89834: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f499845d970> <<< 25201 1726882680.90026: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f499847ad60> <<< 25201 1726882680.90030: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f49984665e0> <<< 25201 1726882680.90081: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f499845df10> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f49984669d0> <<< 25201 1726882680.90102: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882680.90116: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.compat # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/compat/__init__.py # zipimport: zlib available <<< 25201 1726882680.90183: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882680.90273: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 25201 1726882680.90306: stdout chunk (state=3): >>>import ansible.module_utils.common # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/common/__init__.py <<< 25201 1726882680.90322: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/common/text/__init__.py # zipimport: zlib available <<< 25201 1726882680.90417: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882680.90514: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882680.90955: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882680.91432: stdout chunk (state=3): >>>import ansible.module_utils.six 
# loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/six/__init__.py <<< 25201 1726882680.91435: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import ansible.module_utils.common.text.converters # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/common/text/converters.py <<< 25201 1726882680.91474: stdout chunk (state=3): >>># /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' <<< 25201 1726882680.91525: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f49984797f0> <<< 25201 1726882680.91961: stdout chunk (state=3): >>># /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f49984b4880> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4997ff89a0> import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/compat/selinux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils._text # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/_text.py # zipimport: zlib available # zipimport: zlib available <<< 25201 1726882680.92103: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py # code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' <<< 25201 1726882680.92116: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f49984e0730> # zipimport: zlib available <<< 25201 1726882680.92767: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882680.93385: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882680.93479: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882680.93571: stdout chunk (state=3): >>>import ansible.module_utils.common.collections # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/common/collections.py <<< 25201 1726882680.93592: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882680.93636: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882680.93698: stdout chunk (state=3): >>>import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/common/warnings.py <<< 25201 1726882680.93714: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882680.93800: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882680.93907: stdout chunk (state=3): >>>import ansible.module_utils.errors # loaded from Zip 
/tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/errors.py <<< 25201 1726882680.93935: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882680.93972: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882680.93976: stdout chunk (state=3): >>>import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/parsing/__init__.py <<< 25201 1726882680.93987: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882680.94037: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882680.94082: stdout chunk (state=3): >>>import ansible.module_utils.parsing.convert_bool # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/parsing/convert_bool.py <<< 25201 1726882680.94106: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882680.94323: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882680.94533: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py <<< 25201 1726882680.94602: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' import '_ast' # <<< 25201 1726882680.94665: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f49985263a0> <<< 25201 1726882680.94670: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882680.94731: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882680.94837: stdout chunk (state=3): >>>import ansible.module_utils.common.text.formatters # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/common/text/formatters.py import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/common/validation.py import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/common/parameters.py <<< 25201 1726882680.94841: stdout chunk (state=3): >>>import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/common/arg_spec.py # zipimport: zlib available <<< 25201 1726882680.94879: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882680.94910: stdout chunk (state=3): >>>import ansible.module_utils.common.locale # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/common/locale.py <<< 25201 1726882680.94997: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882680.95012: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 25201 1726882680.95116: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882680.95199: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' <<< 25201 1726882680.95275: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed 
from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4998498610> <<< 25201 1726882680.95384: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4997e89b50> <<< 25201 1726882680.95422: stdout chunk (state=3): >>>import ansible.module_utils.common.file # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/common/file.py import ansible.module_utils.common.process # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/common/process.py <<< 25201 1726882680.95425: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882680.95478: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882680.95536: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882680.95549: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882680.95603: stdout chunk (state=3): >>># /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py <<< 25201 1726882680.95616: stdout chunk (state=3): >>># code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' <<< 25201 1726882680.95627: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py <<< 25201 1726882680.95653: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' <<< 25201 1726882680.95689: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py <<< 25201 1726882680.95701: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' <<< 25201 1726882680.95784: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f49984ab6a0> <<< 25201 1726882680.95821: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f49984f7e50> <<< 25201 1726882680.95886: stdout chunk (state=3): >>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4998526850> # destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/distro/__init__.py # zipimport: zlib available <<< 25201 1726882680.95905: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882680.95935: stdout chunk (state=3): >>>import ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/common/_utils.py import ansible.module_utils.common.sys_info # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/common/sys_info.py <<< 25201 1726882680.96022: stdout chunk (state=3): >>>import ansible.module_utils.basic # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/basic.py <<< 25201 1726882680.96057: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import ansible.modules # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/modules/__init__.py <<< 25201 1726882680.96060: stdout chunk (state=3): >>># 
zipimport: zlib available <<< 25201 1726882680.96102: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882680.96189: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882680.96203: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882680.96207: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882680.96227: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882680.96272: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882680.96296: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882680.96333: stdout chunk (state=3): >>>import ansible.module_utils.facts.namespace # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/facts/namespace.py # zipimport: zlib available <<< 25201 1726882680.96408: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882680.96477: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882680.96492: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882680.96527: stdout chunk (state=3): >>>import ansible.module_utils.compat.typing # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/compat/typing.py <<< 25201 1726882680.96531: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882680.96673: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882680.96808: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882680.96838: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882680.96893: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/__init__.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc' <<< 25201 1726882680.96920: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/context.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc' <<< 25201 1726882680.96944: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/process.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc' <<< 25201 1726882680.96973: stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4997ff66d0> <<< 25201 1726882680.96997: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/reduction.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc' <<< 25201 1726882680.97024: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc matches /usr/lib64/python3.9/pickle.py <<< 25201 1726882680.97051: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc' <<< 25201 1726882680.97087: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc matches /usr/lib64/python3.9/_compat_pickle.py # code object from '/usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc' <<< 25201 1726882680.97101: stdout chunk (state=3): >>>import '_compat_pickle' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f4997fda5e0> <<< 25201 1726882680.97141: stdout chunk (state=3): >>># extension module '_pickle' loaded from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4997fda670> <<< 25201 1726882680.97208: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f499800a040> <<< 25201 1726882680.97227: stdout chunk (state=3): >>>import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4997ff6520> <<< 25201 1726882680.97254: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4997d73fa0> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4997d73be0> <<< 25201 1726882680.97294: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/pool.py <<< 25201 1726882680.97306: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc' <<< 25201 1726882680.97318: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc matches /usr/lib64/python3.9/queue.py # code object from '/usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc' <<< 25201 1726882680.97356: stdout chunk (state=3): >>># extension module '_queue' loaded from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4998506d00> <<< 25201 1726882680.97359: stdout chunk (state=3): >>>import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4997fbae80> <<< 25201 1726882680.97391: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/util.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc' <<< 25201 1726882680.97435: stdout chunk (state=3): >>>import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f49985060d0> <<< 25201 1726882680.97448: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/connection.py <<< 25201 1726882680.97460: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc' <<< 25201 1726882680.97491: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4997ddcfd0> <<< 25201 1726882680.97514: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4998008e50> <<< 25201 1726882680.97573: stdout chunk (state=3): >>>import 'multiprocessing.pool' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f4997d73e50> import ansible.module_utils.facts.timeout # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/facts/timeout.py import ansible.module_utils.facts.collector # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/facts/collector.py <<< 25201 1726882680.97589: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882680.97604: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.facts.other # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/facts/other/__init__.py # zipimport: zlib available <<< 25201 1726882680.97647: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882680.97710: stdout chunk (state=3): >>>import ansible.module_utils.facts.other.facter # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/facts/other/facter.py <<< 25201 1726882680.97713: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882680.97764: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882680.97817: stdout chunk (state=3): >>>import ansible.module_utils.facts.other.ohai # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/facts/other/ohai.py <<< 25201 1726882680.97844: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 25201 1726882680.97847: stdout chunk (state=3): >>>import ansible.module_utils.facts.system # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/facts/system/__init__.py # zipimport: zlib available <<< 25201 1726882680.98399: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.facts.system.apparmor # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/facts/system/apparmor.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.caps # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/facts/system/caps.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.chroot # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/facts/system/chroot.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.utils # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/facts/utils.py import ansible.module_utils.facts.system.cmdline # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/facts/system/cmdline.py # zipimport: zlib available <<< 25201 1726882680.99653: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.facts.system.distribution # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/facts/system/distribution.py <<< 25201 1726882680.99670: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882680.99747: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882680.99836: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882680.99885: stdout chunk 
(state=3): >>># zipimport: zlib available <<< 25201 1726882680.99940: stdout chunk (state=3): >>>import ansible.module_utils.compat.datetime # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/compat/datetime.py <<< 25201 1726882680.99969: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.date_time # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/facts/system/date_time.py <<< 25201 1726882680.99982: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882681.00025: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882681.00067: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.env # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/facts/system/env.py <<< 25201 1726882681.00093: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882681.00183: stdout chunk (state=3): >>># zipimport: zlib available<<< 25201 1726882681.00186: stdout chunk (state=3): >>> <<< 25201 1726882681.00252: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.dns # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/facts/system/dns.py <<< 25201 1726882681.00288: stdout chunk (state=3): >>># zipimport: zlib available<<< 25201 1726882681.00291: stdout chunk (state=3): >>> <<< 25201 1726882681.00319: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882681.00366: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.fips # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/facts/system/fips.py <<< 25201 1726882681.00390: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882681.00436: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882681.00557: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.loadavg # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/facts/system/loadavg.py # zipimport: zlib available # zipimport: zlib available <<< 25201 1726882681.00678: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc matches /usr/lib64/python3.9/glob.py # code object from '/usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4997cf0e50> <<< 25201 1726882681.00720: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc matches /usr/lib64/python3.9/configparser.py # code object from '/usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc' <<< 25201 1726882681.00885: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4997cf09d0> import ansible.module_utils.facts.system.local # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/facts/system/local.py <<< 25201 1726882681.00888: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882681.00938: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882681.01004: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.lsb # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/facts/system/lsb.py <<< 25201 1726882681.01008: stdout chunk (state=3): >>># zipimport: zlib 
available <<< 25201 1726882681.01080: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882681.01157: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.pkg_mgr # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/facts/system/pkg_mgr.py # zipimport: zlib available <<< 25201 1726882681.01216: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882681.01293: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.platform # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/facts/system/platform.py # zipimport: zlib available <<< 25201 1726882681.01324: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882681.01372: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc matches /usr/lib64/python3.9/ssl.py <<< 25201 1726882681.01390: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc' <<< 25201 1726882681.01537: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4997ceb790> <<< 25201 1726882681.01782: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4997ffb7f0> import ansible.module_utils.facts.system.python # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/facts/system/python.py # zipimport: zlib available <<< 25201 1726882681.01833: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882681.01890: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.selinux # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/facts/system/selinux.py <<< 25201 1726882681.01893: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882681.01953: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882681.02029: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882681.02120: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882681.02266: stdout chunk (state=3): >>>import ansible.module_utils.compat.version # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/compat/version.py import ansible.module_utils.facts.system.service_mgr # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/facts/system/service_mgr.py <<< 25201 1726882681.02269: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882681.02295: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882681.02340: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.ssh_pub_keys # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/facts/system/ssh_pub_keys.py <<< 25201 1726882681.02343: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882681.02375: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882681.02422: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc matches /usr/lib64/python3.9/getpass.py # code object from 
'/usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc' <<< 25201 1726882681.02474: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4997cb0310> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4997cb0340> import ansible.module_utils.facts.system.user # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/facts/system/user.py <<< 25201 1726882681.02502: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 25201 1726882681.02516: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/__init__.py # zipimport: zlib available <<< 25201 1726882681.02549: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882681.02597: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.base # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/base.py <<< 25201 1726882681.02600: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882681.02727: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882681.02855: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.aix # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/aix.py # zipimport: zlib available <<< 25201 1726882681.02940: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882681.03022: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882681.03057: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882681.03086: stdout chunk (state=3): >>>import ansible.module_utils.facts.sysctl # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/facts/sysctl.py <<< 25201 1726882681.03113: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.darwin # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/darwin.py # zipimport: zlib available <<< 25201 1726882681.03201: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882681.03215: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882681.03327: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882681.03454: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.freebsd # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/freebsd.py import ansible.module_utils.facts.hardware.dragonfly # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/dragonfly.py <<< 25201 1726882681.03457: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882681.03557: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882681.03669: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.hpux # loaded from Zip 
/tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/hpux.py <<< 25201 1726882681.03679: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882681.03692: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882681.03727: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882681.04149: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882681.04562: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.linux # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/linux.py import ansible.module_utils.facts.hardware.hurd # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/hurd.py <<< 25201 1726882681.04580: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882681.04654: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882681.04752: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.netbsd # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/netbsd.py # zipimport: zlib available <<< 25201 1726882681.04838: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882681.04929: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.openbsd # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/openbsd.py <<< 25201 1726882681.04932: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882681.05049: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882681.05206: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.sunos # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/sunos.py # zipimport: zlib available <<< 25201 1726882681.05224: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.facts.network # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/facts/network/__init__.py <<< 25201 1726882681.05227: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882681.05246: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882681.05298: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.base # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/facts/network/base.py <<< 25201 1726882681.05301: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882681.05386: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882681.05455: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882681.05630: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882681.05805: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.generic_bsd # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/facts/network/generic_bsd.py import ansible.module_utils.facts.network.aix # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/facts/network/aix.py <<< 25201 1726882681.05808: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882681.05837: stdout chunk (state=3): >>># zipimport: zlib available <<< 
25201 1726882681.05884: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.darwin # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/facts/network/darwin.py <<< 25201 1726882681.05912: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882681.05916: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882681.05927: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.dragonfly # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/facts/network/dragonfly.py # zipimport: zlib available <<< 25201 1726882681.05989: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882681.06065: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.fc_wwn # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/facts/network/fc_wwn.py <<< 25201 1726882681.06069: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882681.06104: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882681.06107: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.freebsd # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/facts/network/freebsd.py # zipimport: zlib available <<< 25201 1726882681.06154: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882681.06217: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.hpux # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/facts/network/hpux.py <<< 25201 1726882681.06221: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882681.06259: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882681.06315: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.hurd # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/facts/network/hurd.py # zipimport: zlib available <<< 25201 1726882681.06533: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882681.06760: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.linux # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/facts/network/linux.py <<< 25201 1726882681.06763: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882681.06801: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882681.06858: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.iscsi # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/facts/network/iscsi.py # zipimport: zlib available <<< 25201 1726882681.06893: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882681.06925: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.nvme # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/facts/network/nvme.py <<< 25201 1726882681.06986: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882681.06998: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.facts.network.netbsd # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/facts/network/netbsd.py # zipimport: zlib available <<< 25201 1726882681.07029: 
stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882681.07069: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.openbsd # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/facts/network/openbsd.py # zipimport: zlib available <<< 25201 1726882681.07129: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882681.07233: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.sunos # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/facts/network/sunos.py # zipimport: zlib available <<< 25201 1726882681.07250: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.facts.virtual # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/__init__.py <<< 25201 1726882681.07253: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882681.07278: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882681.07333: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.base # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/base.py <<< 25201 1726882681.07351: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 25201 1726882681.07364: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882681.07408: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882681.07449: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882681.07511: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882681.07583: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.sysctl # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/sysctl.py <<< 25201 1726882681.07596: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.freebsd # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/freebsd.py import ansible.module_utils.facts.virtual.dragonfly # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/dragonfly.py # zipimport: zlib available <<< 25201 1726882681.07629: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882681.07685: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.hpux # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/hpux.py # zipimport: zlib available <<< 25201 1726882681.07848: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882681.08019: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.linux # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/linux.py # zipimport: zlib available <<< 25201 1726882681.08051: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882681.08099: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.netbsd # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/netbsd.py # zipimport: zlib available <<< 25201 1726882681.08141: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882681.08197: stdout chunk (state=3): 
>>>import ansible.module_utils.facts.virtual.openbsd # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/openbsd.py <<< 25201 1726882681.08200: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882681.08256: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882681.08344: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.sunos # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/sunos.py import ansible.module_utils.facts.default_collectors # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/facts/default_collectors.py <<< 25201 1726882681.08347: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882681.08413: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882681.08502: stdout chunk (state=3): >>>import ansible.module_utils.facts.ansible_collector # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/facts/ansible_collector.py <<< 25201 1726882681.08505: stdout chunk (state=3): >>>import ansible.module_utils.facts.compat # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/facts/compat.py import ansible.module_utils.facts # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/facts/__init__.py <<< 25201 1726882681.08580: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882681.08804: stdout chunk (state=3): >>>import 'gc' # <<< 25201 1726882681.09703: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc matches /usr/lib64/python3.9/encodings/idna.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc' <<< 25201 1726882681.09719: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc matches /usr/lib64/python3.9/stringprep.py # code object from '/usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc' <<< 25201 1726882681.09755: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4997cd3c40> <<< 25201 1726882681.09774: stdout chunk (state=3): >>>import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4997c627f0> <<< 25201 1726882681.09820: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4997c622e0> <<< 25201 1726882681.11190: stdout chunk (state=3): >>> <<< 25201 1726882681.11243: stdout chunk (state=3): >>>{"ansible_facts": {"ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.40.7 33528 10.31.11.158 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": 
"root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.40.7 33528 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/1", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBALEARW5ZJ51XTLSDuUsPojumVU0f1DmiQsXjMOap4QLlljOiysapjSUe6pZOyAdiI/KfARhDoOFvlC07kCLCcs7DDk8JxBZpsM0D55SdDlfwsB3FVgWNP+9by8G6kzbePHWdZyyWlAuavj4OAEwAjpWpP8/daus0ha4xywlVVoKjAAAAFQCbiW4bR+tgMvjrxC198dqI1mTbjQAAAIBzCzkJTtnGDKfOHq2dFI5cUEuaj1PgRot3wyaXENzUjZVnIFgXUmgKDCxO+EAtU6uAkBPQF4XNgiuaw5bavYpZxcJ4WIpM4ZDRoSkc7BBbJPRLZ45GfrHJwgqAmAZ3RSvVqeXE4WKQHLm43/eDHewgPqqqWe6QVuQH5SEe79yk3wAAAIEArG+AuupiAeoVJ9Lh36QMj4kRo5pTASh2eD5MqSOdy39UhsXbWBcj3JCIvNk/nwep/9neGyRZ5t5wT05dRX80vlgZJX65hrbepO+lqC3wlng+6GQ34D7TJKYnvEkR3neE0+06kx5R6IRWZf1YQV6fMQhx8AJ2JmvnLFicmYlkhQQ=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDND+RJCrYgIUzolo5fZ64Ey6cksefKDUWmGDjsqVTmuT3HrlDyUZOro4JAnUQBmiamXsJUFbrFdJAVpukD4yyowqCQLr0ZFuKNEzrt5CObrtWflOskKynO3kaoU0WhDkqIbwS2j/+NxBCxgDGqd/5Os3cOMv3eyjUElz6xoI4zsmGMfxVYmT+/SHBfoyxyqY8Hw2Ooq+H5L9OlYgV4hqu7kKPpM1THUJTjy47m6qvws5gztclLjPA1KIW2Dz6kKzUYspNJcoS2sK1xFvL7mBjpGAP7WhXVH2n5ySenQ24Z6mEj+tG2f11rjPpjCUjDzzciGCWiRDZWBLm/GGmQXJJ8zAYnw82yIUKqufLrr1wmcXICPMVj9pFjXSoBWe/yhX9E87w7YD5HWsUrgrLdSctdV4QYy+R5g9ERi7FjwbRsuZ04BihZs70+f/29hUzuc6MA87KVovGT0Uc7GVC7bx8NLt0bTBsbydlONVHVQuol/YEpQrQophDvmBfh+PgMDH8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOEITn1vyppR+Moe1UdR0WGPhUnQ/dwHNcNi0OYy21LkBQ5jsxOPLvZ+C2MbRYlz2afs4nYYIV8E0AuK6aRks3w=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIKEdFOHVk9tX1R+zEyLVdxS/U5QeeeFYWSnUmjpXlpt7", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_local": {}, "ansible_fips": false, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "38", "second": "01", "epoch": "1726882681", "epoch_int": "1726882681", "date": "2024-09-20", "time": "21:38:01", "iso8601_micro": "2024-09-21T01:38:01.090379Z", "iso8601": "2024-09-21T01:38:01Z", "iso8601_basic": "20240920T213801090379", "iso8601_bas<<< 25201 1726882681.11247: stdout 
chunk (state=3): >>>ic_short": "20240920T213801", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_lsb": {}, "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_apparmor": {"status": "disabled"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system": "Linux", "ansible_kernel": "5.14.0-508.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 15:49:37 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "ip-10-31-11-158.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-11-158", "ansible_nodename": "ip-10-31-11-158.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "21e18164a0c64d0daed004bd8a1b67b7", "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 25201 1726882681.11908: stdout chunk (state=3): >>># clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout <<< 25201 1726882681.12002: stdout chunk (state=3): >>># restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy 
sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing 
ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy<<< 25201 1726882681.12088: stdout chunk (state=3): >>> # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle <<< 25201 1726882681.12197: stdout chunk (state=3): >>># cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] 
removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix <<< 25201 1726882681.12244: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing 
ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux <<< 25201 1726882681.12290: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy 
ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing gc # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna <<< 25201 1726882681.12589: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 25201 1726882681.12611: stdout chunk (state=3): >>># destroy importlib.util # destroy importlib.abc # destroy importlib.machinery <<< 25201 1726882681.12649: stdout chunk (state=3): >>># destroy zipimport # destroy _compression <<< 25201 1726882681.12684: stdout chunk (state=3): >>># destroy binascii # destroy importlib # destroy bz2 # destroy lzma <<< 25201 1726882681.12698: stdout chunk (state=3): >>># destroy __main__ # destroy locale # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings <<< 25201 1726882681.12719: stdout chunk (state=3): >>># destroy syslog # destroy uuid <<< 25201 1726882681.12768: stdout chunk (state=3): >>># destroy selinux # destroy distro # destroy logging # destroy argparse <<< 25201 1726882681.12838: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy pickle # destroy multiprocessing.context # destroy array <<< 25201 1726882681.12859: stdout chunk (state=3): >>># destroy _compat_pickle <<< 25201 1726882681.12887: stdout chunk (state=3): >>># destroy queue # destroy multiprocessing.process # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction <<< 25201 1726882681.12914: stdout chunk (state=3): >>># destroy shlex # destroy datetime <<< 25201 1726882681.12949: stdout chunk (state=3): >>># destroy base64 # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy json <<< 25201 1726882681.12965: stdout chunk (state=3): >>># destroy socket # destroy struct # destroy glob # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector <<< 25201 1726882681.13039: stdout chunk (state=3): >>># cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping unicodedata # cleanup[3] wiping gc # cleanup[3] wiping termios # cleanup[3] wiping _ssl # cleanup[3] wiping configparser # cleanup[3] wiping _multiprocessing # cleanup[3] wiping _queue # cleanup[3] wiping _pickle # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # 
cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket <<< 25201 1726882681.13109: stdout chunk (state=3): >>># cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 <<< 25201 1726882681.13174: stdout chunk (state=3): >>># cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq <<< 25201 1726882681.13237: stdout chunk (state=3): >>># destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs <<< 25201 1726882681.13290: stdout chunk (state=3): >>># cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins <<< 25201 1726882681.13315: stdout chunk (state=3): >>># destroy unicodedata # destroy gc # destroy termios # destroy _ssl # destroy _multiprocessing # destroy _queue # destroy _pickle # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal <<< 25201 1726882681.13511: stdout chunk (state=3): >>># destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize <<< 25201 1726882681.13540: stdout chunk (state=3): >>># destroy _heapq # destroy posixpath # 
destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors <<< 25201 1726882681.13576: stdout chunk (state=3): >>># destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator <<< 25201 1726882681.13592: stdout chunk (state=3): >>># destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal <<< 25201 1726882681.13622: stdout chunk (state=3): >>># destroy _frozen_importlib # clear sys.audit hooks <<< 25201 1726882681.14061: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. <<< 25201 1726882681.14065: stdout chunk (state=3): >>><<< 25201 1726882681.14068: stderr chunk (state=3): >>><<< 25201 1726882681.14308: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py # code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' import '_codecs' # import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7f499911edc0> # /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f49990c33a0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f499911eb20> # /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f499911eac0> import '_signal' # # /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7f49990c3490> # /usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py # code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py # code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' import '_abc' # import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f49990c3940> import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7f49990c3670> # 
/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py # code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py # code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py # code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' import '_stat' # import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7f499907a190> # /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py # code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f499907a220> # /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py # code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py # code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f499909d850> import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f499907a940> import 'os' # <_frozen_importlib_external.SourceFileLoader object at 0x7f49990db880> # /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py # code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4999073d90> # /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py # code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' import '_locale' # import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f499909dd90> import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7f49990c3970> Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. 
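The surrounding lines are CPython's own verbose import trace: entries of the form "# <path>.pyc matches <path>.py" and "import 'x' # <_frozen_importlib_external.SourceFileLoader ...>" are what the interpreter prints when import tracing is enabled (the behavior of the stock -v switch / PYTHONVERBOSE), and the "# cleanup[N] ..." / "# destroy ..." lines earlier in this output are the same mode's module-teardown messages emitted at interpreter shutdown. How that tracing got switched on for this particular remote module run is not shown in the log; the following is only a minimal local sketch, assuming nothing beyond a python3 interpreter, of how such a trace can be reproduced and filtered for inspection:

    import subprocess
    import sys

    # Run a trivial import under CPython's verbose mode (-v). The import trace
    # and the shutdown "# cleanup[...]" / "# destroy ..." messages go to stderr.
    proc = subprocess.run(
        [sys.executable, "-v", "-c", "import json"],
        capture_output=True,
        text=True,
    )

    # Keep only the lines that look like the ones captured in this log.
    for line in proc.stderr.splitlines():
        if line.startswith(("import ", "# cleanup", "# destroy")):
            print(line)

The log now continues with the remainder of the captured stdout: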
# /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py # code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py # code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py # code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py # code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4998dd3eb0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4998dd5f40> # /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py # code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' import '_sre' # # /usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py # code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py # code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4998dcc610> import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4998dd2640> import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4998dd3370> # /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py # code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/collections/__init__.py # code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py # code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4998c8edf0> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4998c8e8e0> import 'itertools' # # /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py # code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4998c8eee0> # /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches /usr/lib64/python3.9/operator.py # code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4998c8efa0> # /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py # code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4998c8eeb0> import '_collections' # import 
'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4998daed60> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4998da7640> # /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py # code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4998dba6a0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4998ddadf0> # /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4998ca1ca0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4998dae280> # extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4998dba2b0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4998de09a0> # /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py # code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py # code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4998ca1fd0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4998ca1dc0> # /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches /usr/lib64/python3.9/importlib/machinery.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4998ca1d30> # /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py # code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py # code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4998c743a0> # /usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches 
/usr/lib64/python3.9/contextlib.py # code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4998c74490> import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4998ca9fd0> import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4998ca3a60> import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4998ca3580> # /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py # code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py # code object from '/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py # code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4998bc21f0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4998c5fb80> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4998ca3ee0> import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4998ddafd0> # /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py # code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py # code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4998bd4b20> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4998bd4e50> # /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py # code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py # code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4998be6760> # /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py # code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4998be6ca0> # extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4998b7e3d0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4998bd4f40> # /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py # code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' # 
extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4998b8f2b0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4998be65e0> import 'pwd' # # extension module 'grp' loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4998b8f370> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4998ca1a00> # /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py # code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py # code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' # extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4998baa6d0> # /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4998baa9a0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4998baa790> # extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4998baa880> # /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc matches /usr/lib64/python3.9/hashlib.py # code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4998baacd0> # extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4998bb7220> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4998baa910> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4998b9ea60> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4998ca15e0> # /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py # code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' import 
'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4998baaac0> # code object from '/usr/lib64/python3.9/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f49985e76a0> # zipimport: found 103 names in '/tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip' # zipimport: zlib available # zipimport: zlib available import ansible # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/__init__.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py # code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f49985237f0> # /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py # code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py # code object from '/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' # extension module '_json' loaded from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4998523160> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4998523280> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4998523f40> # /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f49985234f0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4998523d60> import 'atexit' # # extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4998523fa0> # /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py # code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4998523100> # /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py # code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py # code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py # code 
object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f49984fa100> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f49983ff100> # extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f49983ff2e0> # /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py # code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f49983ffc70> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f499850adc0> import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f499850a3a0> # /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py # code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f499850afa0> # /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py # code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py # code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches /usr/lib64/python3.9/tokenize.py # code object from '/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py # code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f499855ac70> import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4998505d00> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f49985053d0> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f49984d8b50> # extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f49985054f0> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4998505520> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' # 
/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py # code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f499845b310> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f499856c220> # /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py # code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4998467880> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f499856c3a0> # /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py # code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py # code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f499856cca0> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4998467820> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4998504af0> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f499856c940> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f499856c5b0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f49985648e0> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py # code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' # extension module '_socket' executed from 
'/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f499845d970> # extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f499847ad60> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f49984665e0> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f499845df10> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f49984669d0> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/compat/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/common/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/common/text/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.six # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/six/__init__.py import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import ansible.module_utils.common.text.converters # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/common/text/converters.py # /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f49984797f0> # /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f49984b4880> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4997ff89a0> import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/compat/selinux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils._text # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/_text.py # 
zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py # code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f49984e0730> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.collections # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/common/collections.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/common/warnings.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.errors # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/errors.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/parsing/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing.convert_bool # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/parsing/convert_bool.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py # code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f49985263a0> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text.formatters # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/common/text/formatters.py import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/common/validation.py import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/common/parameters.py import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/common/arg_spec.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.locale # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/common/locale.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4998498610> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4997e89b50> import 
ansible.module_utils.common.file # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/common/file.py import ansible.module_utils.common.process # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/common/process.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py # code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py # code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py # code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f49984ab6a0> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f49984f7e50> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4998526850> # destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/distro/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/common/_utils.py import ansible.module_utils.common.sys_info # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/common/sys_info.py import ansible.module_utils.basic # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/basic.py # zipimport: zlib available # zipimport: zlib available import ansible.modules # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/modules/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.namespace # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/facts/namespace.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.typing # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/compat/typing.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/__init__.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/context.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc' # /usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/process.py # code object from 
'/usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4997ff66d0> # /usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/reduction.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc matches /usr/lib64/python3.9/pickle.py # code object from '/usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc matches /usr/lib64/python3.9/_compat_pickle.py # code object from '/usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4997fda5e0> # extension module '_pickle' loaded from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4997fda670> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f499800a040> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4997ff6520> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4997d73fa0> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4997d73be0> # /usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/pool.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc matches /usr/lib64/python3.9/queue.py # code object from '/usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc' # extension module '_queue' loaded from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4998506d00> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4997fbae80> # /usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/util.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f49985060d0> # /usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/connection.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4997ddcfd0> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4998008e50> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4997d73e50> import 
ansible.module_utils.facts.timeout # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/facts/timeout.py import ansible.module_utils.facts.collector # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/facts/collector.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/facts/other/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other.facter # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/facts/other/facter.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other.ohai # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/facts/other/ohai.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/facts/system/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.apparmor # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/facts/system/apparmor.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.caps # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/facts/system/caps.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.chroot # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/facts/system/chroot.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.utils # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/facts/utils.py import ansible.module_utils.facts.system.cmdline # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/facts/system/cmdline.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.distribution # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/facts/system/distribution.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.datetime # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/compat/datetime.py import ansible.module_utils.facts.system.date_time # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/facts/system/date_time.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.env # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/facts/system/env.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.dns # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/facts/system/dns.py # zipimport: zlib available # zipimport: zlib available import 
ansible.module_utils.facts.system.fips # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/facts/system/fips.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.loadavg # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/facts/system/loadavg.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc matches /usr/lib64/python3.9/glob.py # code object from '/usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4997cf0e50> # /usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc matches /usr/lib64/python3.9/configparser.py # code object from '/usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4997cf09d0> import ansible.module_utils.facts.system.local # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/facts/system/local.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.lsb # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/facts/system/lsb.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.pkg_mgr # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/facts/system/pkg_mgr.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.platform # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/facts/system/platform.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc matches /usr/lib64/python3.9/ssl.py # code object from '/usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4997ceb790> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4997ffb7f0> import ansible.module_utils.facts.system.python # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/facts/system/python.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.selinux # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/facts/system/selinux.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.version # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/compat/version.py import ansible.module_utils.facts.system.service_mgr # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/facts/system/service_mgr.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.ssh_pub_keys # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/facts/system/ssh_pub_keys.py # 
zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc matches /usr/lib64/python3.9/getpass.py # code object from '/usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4997cb0310> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4997cb0340> import ansible.module_utils.facts.system.user # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/facts/system/user.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.base # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/base.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.aix # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/aix.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.sysctl # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/facts/sysctl.py import ansible.module_utils.facts.hardware.darwin # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/darwin.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.freebsd # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/freebsd.py import ansible.module_utils.facts.hardware.dragonfly # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.hpux # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/hpux.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.linux # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/linux.py import ansible.module_utils.facts.hardware.hurd # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/hurd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.netbsd # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/netbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.openbsd # loaded from Zip 
/tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.sunos # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/sunos.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/facts/network/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.base # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/facts/network/base.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.generic_bsd # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/facts/network/generic_bsd.py import ansible.module_utils.facts.network.aix # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/facts/network/aix.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.darwin # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/facts/network/darwin.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.dragonfly # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/facts/network/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.fc_wwn # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/facts/network/fc_wwn.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.freebsd # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/facts/network/freebsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.hpux # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/facts/network/hpux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.hurd # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/facts/network/hurd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.linux # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/facts/network/linux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.iscsi # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/facts/network/iscsi.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.nvme # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/facts/network/nvme.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.netbsd # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/facts/network/netbsd.py # 
zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.openbsd # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/facts/network/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.sunos # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/facts/network/sunos.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.base # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/base.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.sysctl # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/sysctl.py import ansible.module_utils.facts.virtual.freebsd # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/freebsd.py import ansible.module_utils.facts.virtual.dragonfly # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.hpux # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/hpux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.linux # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/linux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.netbsd # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/netbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.openbsd # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.sunos # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/sunos.py import ansible.module_utils.facts.default_collectors # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/facts/default_collectors.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.ansible_collector # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/facts/ansible_collector.py import ansible.module_utils.facts.compat # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/facts/compat.py import ansible.module_utils.facts # loaded from Zip /tmp/ansible_setup_payload_9ldg90n5/ansible_setup_payload.zip/ansible/module_utils/facts/__init__.py # zipimport: zlib available import 'gc' # # 
/usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc matches /usr/lib64/python3.9/encodings/idna.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc matches /usr/lib64/python3.9/stringprep.py # code object from '/usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4997cd3c40> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4997c627f0> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4997c622e0> {"ansible_facts": {"ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.40.7 33528 10.31.11.158 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.40.7 33528 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/1", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBALEARW5ZJ51XTLSDuUsPojumVU0f1DmiQsXjMOap4QLlljOiysapjSUe6pZOyAdiI/KfARhDoOFvlC07kCLCcs7DDk8JxBZpsM0D55SdDlfwsB3FVgWNP+9by8G6kzbePHWdZyyWlAuavj4OAEwAjpWpP8/daus0ha4xywlVVoKjAAAAFQCbiW4bR+tgMvjrxC198dqI1mTbjQAAAIBzCzkJTtnGDKfOHq2dFI5cUEuaj1PgRot3wyaXENzUjZVnIFgXUmgKDCxO+EAtU6uAkBPQF4XNgiuaw5bavYpZxcJ4WIpM4ZDRoSkc7BBbJPRLZ45GfrHJwgqAmAZ3RSvVqeXE4WKQHLm43/eDHewgPqqqWe6QVuQH5SEe79yk3wAAAIEArG+AuupiAeoVJ9Lh36QMj4kRo5pTASh2eD5MqSOdy39UhsXbWBcj3JCIvNk/nwep/9neGyRZ5t5wT05dRX80vlgZJX65hrbepO+lqC3wlng+6GQ34D7TJKYnvEkR3neE0+06kx5R6IRWZf1YQV6fMQhx8AJ2JmvnLFicmYlkhQQ=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDND+RJCrYgIUzolo5fZ64Ey6cksefKDUWmGDjsqVTmuT3HrlDyUZOro4JAnUQBmiamXsJUFbrFdJAVpukD4yyowqCQLr0ZFuKNEzrt5CObrtWflOskKynO3kaoU0WhDkqIbwS2j/+NxBCxgDGqd/5Os3cOMv3eyjUElz6xoI4zsmGMfxVYmT+/SHBfoyxyqY8Hw2Ooq+H5L9OlYgV4hqu7kKPpM1THUJTjy47m6qvws5gztclLjPA1KIW2Dz6kKzUYspNJcoS2sK1xFvL7mBjpGAP7WhXVH2n5ySenQ24Z6mEj+tG2f11rjPpjCUjDzzciGCWiRDZWBLm/GGmQXJJ8zAYnw82yIUKqufLrr1wmcXICPMVj9pFjXSoBWe/yhX9E87w7YD5HWsUrgrLdSctdV4QYy+R5g9ERi7FjwbRsuZ04BihZs70+f/29hUzuc6MA87KVovGT0Uc7GVC7bx8NLt0bTBsbydlONVHVQuol/YEpQrQophDvmBfh+PgMDH8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": 
"AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOEITn1vyppR+Moe1UdR0WGPhUnQ/dwHNcNi0OYy21LkBQ5jsxOPLvZ+C2MbRYlz2afs4nYYIV8E0AuK6aRks3w=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIKEdFOHVk9tX1R+zEyLVdxS/U5QeeeFYWSnUmjpXlpt7", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_local": {}, "ansible_fips": false, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "38", "second": "01", "epoch": "1726882681", "epoch_int": "1726882681", "date": "2024-09-20", "time": "21:38:01", "iso8601_micro": "2024-09-21T01:38:01.090379Z", "iso8601": "2024-09-21T01:38:01Z", "iso8601_basic": "20240920T213801090379", "iso8601_basic_short": "20240920T213801", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_lsb": {}, "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_apparmor": {"status": "disabled"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system": "Linux", "ansible_kernel": "5.14.0-508.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 15:49:37 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "ip-10-31-11-158.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-11-158", "ansible_nodename": "ip-10-31-11-158.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "21e18164a0c64d0daed004bd8a1b67b7", "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear 
sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing 
__future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing 
ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing 
ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy 
ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing gc # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # destroy _sitebuiltins # destroy importlib.util # destroy importlib.abc # destroy importlib.machinery # destroy zipimport # destroy _compression # destroy binascii # destroy importlib # destroy bz2 # destroy lzma # destroy __main__ # destroy locale # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog # destroy uuid # destroy selinux # destroy distro # destroy logging # destroy argparse # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy queue # destroy multiprocessing.process # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy shlex # destroy datetime # destroy base64 # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy json # destroy socket # destroy struct # destroy glob # destroy ansible.module_utils.compat.typing 
# destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping unicodedata # cleanup[3] wiping gc # cleanup[3] wiping termios # cleanup[3] wiping _ssl # cleanup[3] wiping configparser # cleanup[3] wiping _multiprocessing # cleanup[3] wiping _queue # cleanup[3] wiping _pickle # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy unicodedata # destroy gc # destroy termios # destroy _ssl # destroy _multiprocessing # destroy _queue # destroy _pickle # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal # destroy platform # destroy _uuid # destroy _sre # destroy 
sre_parse # destroy tokenize # destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal # destroy _frozen_importlib # clear sys.audit hooks , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
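Note on the trace above: the module output ends with the target interpreter's import and shutdown chatter ("# cleanup[...]", "# destroy ...") because PYTHONVERBOSE=1 is set in the remote environment (visible under "ansible_env" in the facts payload), so the controller has to locate the JSON object inside that noise; the leftover text is what the "junk after the JSON data" warning below is flagging. The following is a minimal, illustrative Python sketch of that kind of extraction; it is not Ansible's actual parser, and the sample string is made up to mimic the output above.

    import json

    def extract_json(mixed_output: str):
        """Pull the first JSON object out of module output wrapped in verbose-import noise."""
        decoder = json.JSONDecoder()
        start = mixed_output.index("{")                     # first candidate for the JSON payload
        payload, consumed = decoder.raw_decode(mixed_output[start:])
        trailing = mixed_output[start + consumed:].strip()  # the "junk after the JSON data"
        return payload, trailing

    # Hypothetical sample: a facts JSON surrounded by interpreter import/cleanup chatter.
    sample = "import 'glob' # ... {\"ansible_facts\": {\"ansible_fips\": false}} # cleanup[2] removing sys"
    facts, junk = extract_json(sample)
    print(facts["ansible_facts"]["ansible_fips"])           # False
    print(junk)                                             # "# cleanup[2] removing sys"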
[WARNING]: Module invocation had junk after the JSON data: # clear builtins._ # clear sys.path [... same interpreter cleanup trace as shown above ...] # clear sys.audit hooks 25201 1726882681.15784: done with _execute_module (setup, {'gather_subset': 'min', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882680.5761964-25285-156143968590636/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 25201 1726882681.15787: _low_level_execute_command(): starting 25201 1726882681.15789: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882680.5761964-25285-156143968590636/ > /dev/null 2>&1 && sleep 0' 25201 1726882681.16679: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25201 1726882681.16691: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882681.16702: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882681.16715: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882681.16755: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882681.16800: stderr chunk (state=3): >>>debug2: match not found <<< 25201 1726882681.16811: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882681.16824: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25201 1726882681.16896: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 25201 1726882681.16900: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 25201 1726882681.16908: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882681.16916: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882681.16927: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882681.16935: stderr chunk (state=3): >>>debug2: checking match for 'final
all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882681.16941: stderr chunk (state=3): >>>debug2: match found <<< 25201 1726882681.16951: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882681.17123: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882681.17142: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882681.17155: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882681.17290: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882681.19156: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882681.19160: stdout chunk (state=3): >>><<< 25201 1726882681.19171: stderr chunk (state=3): >>><<< 25201 1726882681.19189: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25201 1726882681.19196: handler run complete 25201 1726882681.19258: variable 'ansible_facts' from source: unknown 25201 1726882681.19313: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882681.19425: variable 'ansible_facts' from source: unknown 25201 1726882681.19482: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882681.19537: attempt loop complete, returning result 25201 1726882681.19546: _execute() done 25201 1726882681.19548: dumping result to json 25201 1726882681.19560: done dumping result, returning 25201 1726882681.19574: done running TaskExecutor() for managed_node2/TASK: Gather the minimum subset of ansible_facts required by the network role test [0e448fcc-3ce9-313b-197e-0000000000ca] 25201 1726882681.19577: sending task result for task 0e448fcc-3ce9-313b-197e-0000000000ca 25201 1726882681.19741: done sending task result for task 0e448fcc-3ce9-313b-197e-0000000000ca 25201 1726882681.19744: WORKER PROCESS EXITING ok: [managed_node2] 25201 1726882681.19841: no more pending results, returning what we have 25201 1726882681.19844: results queue empty 25201 1726882681.19845: checking for any_errors_fatal 25201 1726882681.19846: done checking for any_errors_fatal 25201 1726882681.19847: checking for max_fail_percentage 25201 1726882681.19848: done checking for max_fail_percentage 25201 
1726882681.19848: checking to see if all hosts have failed and the running result is not ok 25201 1726882681.19849: done checking to see if all hosts have failed 25201 1726882681.19850: getting the remaining hosts for this loop 25201 1726882681.19851: done getting the remaining hosts for this loop 25201 1726882681.19855: getting the next task for host managed_node2 25201 1726882681.19866: done getting next task for host managed_node2 25201 1726882681.19868: ^ task is: TASK: Check if system is ostree 25201 1726882681.19871: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25201 1726882681.19874: getting variables 25201 1726882681.19876: in VariableManager get_vars() 25201 1726882681.19929: Calling all_inventory to load vars for managed_node2 25201 1726882681.19932: Calling groups_inventory to load vars for managed_node2 25201 1726882681.19936: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882681.19946: Calling all_plugins_play to load vars for managed_node2 25201 1726882681.19948: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882681.19951: Calling groups_plugins_play to load vars for managed_node2 25201 1726882681.20149: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882681.20369: done with get_vars() 25201 1726882681.20379: done getting variables TASK [Check if system is ostree] *********************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17 Friday 20 September 2024 21:38:01 -0400 (0:00:00.733) 0:00:02.379 ****** 25201 1726882681.20525: entering _queue_task() for managed_node2/stat 25201 1726882681.21538: worker is 1 (out of 1 available) 25201 1726882681.21550: exiting _queue_task() for managed_node2/stat 25201 1726882681.21561: done queuing things up, now waiting for results queue to drain 25201 1726882681.21562: waiting for pending results... 
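
The task queued above, "Check if system is ostree", runs the stat module against a marker path on the managed host. As a rough, hedged sketch only (the log excerpt does not show the exact path being stat'd), the short Python below illustrates the kind of file-existence check this boils down to; the marker path /run/ostree-booted is an assumption, chosen because rpm-ostree based systems conventionally create it.

# Illustrative sketch, not the role's actual task: a plain file-existence
# check of the sort the stat task performs on the managed host.
# The marker path is an assumption and is not taken from this log.
import os

def is_ostree_system(marker="/run/ostree-booted"):
    """Return True when the (assumed) ostree marker file exists."""
    return os.path.exists(marker)

if __name__ == "__main__":
    print("ostree system:", is_ostree_system())

In the playbook run itself, the result of the real stat call is what feeds the later "__network_is_ostree" conditional seen in the preceding task output.
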
25201 1726882681.22417: running TaskExecutor() for managed_node2/TASK: Check if system is ostree 25201 1726882681.22620: in run() - task 0e448fcc-3ce9-313b-197e-0000000000cc 25201 1726882681.22632: variable 'ansible_search_path' from source: unknown 25201 1726882681.22636: variable 'ansible_search_path' from source: unknown 25201 1726882681.22674: calling self._execute() 25201 1726882681.22865: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882681.22873: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882681.22889: variable 'omit' from source: magic vars 25201 1726882681.23957: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 25201 1726882681.24537: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 25201 1726882681.24700: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 25201 1726882681.24733: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 25201 1726882681.24886: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 25201 1726882681.24962: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 25201 1726882681.25100: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 25201 1726882681.25132: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 25201 1726882681.25304: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 25201 1726882681.25451: Evaluated conditional (not __network_is_ostree is defined): True 25201 1726882681.25522: variable 'omit' from source: magic vars 25201 1726882681.25568: variable 'omit' from source: magic vars 25201 1726882681.25661: variable 'omit' from source: magic vars 25201 1726882681.25760: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25201 1726882681.25795: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25201 1726882681.25860: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25201 1726882681.25889: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25201 1726882681.25959: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25201 1726882681.25999: variable 'inventory_hostname' from source: host vars for 'managed_node2' 25201 1726882681.26059: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882681.26072: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882681.26284: Set connection var ansible_shell_executable to /bin/sh 25201 1726882681.26300: Set connection var ansible_pipelining to False 25201 1726882681.26311: Set 
connection var ansible_connection to ssh 25201 1726882681.26321: Set connection var ansible_module_compression to ZIP_DEFLATED 25201 1726882681.26328: Set connection var ansible_shell_type to sh 25201 1726882681.26339: Set connection var ansible_timeout to 10 25201 1726882681.26367: variable 'ansible_shell_executable' from source: unknown 25201 1726882681.26411: variable 'ansible_connection' from source: unknown 25201 1726882681.26419: variable 'ansible_module_compression' from source: unknown 25201 1726882681.26427: variable 'ansible_shell_type' from source: unknown 25201 1726882681.26474: variable 'ansible_shell_executable' from source: unknown 25201 1726882681.26481: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882681.26494: variable 'ansible_pipelining' from source: unknown 25201 1726882681.26500: variable 'ansible_timeout' from source: unknown 25201 1726882681.26508: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882681.26785: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 25201 1726882681.26852: variable 'omit' from source: magic vars 25201 1726882681.26875: starting attempt loop 25201 1726882681.26928: running the handler 25201 1726882681.26954: _low_level_execute_command(): starting 25201 1726882681.26970: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 25201 1726882681.28920: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25201 1726882681.28940: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882681.28956: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882681.28981: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882681.29028: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882681.29108: stderr chunk (state=3): >>>debug2: match not found <<< 25201 1726882681.29123: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882681.29147: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25201 1726882681.29161: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 25201 1726882681.29179: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 25201 1726882681.29192: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882681.29211: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882681.29228: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882681.29242: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882681.29259: stderr chunk (state=3): >>>debug2: match found <<< 25201 1726882681.29280: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882681.29412: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882681.29446: stderr chunk (state=3): >>>debug2: fd 3 setting 
O_NONBLOCK <<< 25201 1726882681.29467: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882681.29656: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882681.31314: stdout chunk (state=3): >>>/root <<< 25201 1726882681.31500: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882681.31503: stdout chunk (state=3): >>><<< 25201 1726882681.31507: stderr chunk (state=3): >>><<< 25201 1726882681.31627: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25201 1726882681.31638: _low_level_execute_command(): starting 25201 1726882681.31641: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882681.315308-25326-245920767746401 `" && echo ansible-tmp-1726882681.315308-25326-245920767746401="` echo /root/.ansible/tmp/ansible-tmp-1726882681.315308-25326-245920767746401 `" ) && sleep 0' 25201 1726882681.32557: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882681.32560: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882681.32599: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882681.32604: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882681.32606: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882681.32690: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882681.32693: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 <<< 25201 1726882681.32809: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882681.35022: stdout chunk (state=3): >>>ansible-tmp-1726882681.315308-25326-245920767746401=/root/.ansible/tmp/ansible-tmp-1726882681.315308-25326-245920767746401 <<< 25201 1726882681.35177: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882681.35244: stderr chunk (state=3): >>><<< 25201 1726882681.35247: stdout chunk (state=3): >>><<< 25201 1726882681.35375: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882681.315308-25326-245920767746401=/root/.ansible/tmp/ansible-tmp-1726882681.315308-25326-245920767746401 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25201 1726882681.35379: variable 'ansible_module_compression' from source: unknown 25201 1726882681.35486: ANSIBALLZ: Using lock for stat 25201 1726882681.35489: ANSIBALLZ: Acquiring lock 25201 1726882681.35491: ANSIBALLZ: Lock acquired: 140300039320160 25201 1726882681.35493: ANSIBALLZ: Creating module 25201 1726882681.55651: ANSIBALLZ: Writing module into payload 25201 1726882681.55887: ANSIBALLZ: Writing module 25201 1726882681.55918: ANSIBALLZ: Renaming module 25201 1726882681.55977: ANSIBALLZ: Done creating module 25201 1726882681.55999: variable 'ansible_facts' from source: unknown 25201 1726882681.56106: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882681.315308-25326-245920767746401/AnsiballZ_stat.py 25201 1726882681.56860: Sending initial data 25201 1726882681.56866: Sent initial data (152 bytes) 25201 1726882681.58498: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882681.58502: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882681.58535: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882681.58538: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882681.58540: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882681.58620: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882681.58623: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882681.58629: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882681.58752: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 25201 1726882681.61024: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 25201 1726882681.61116: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 25201 1726882681.61217: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-25201fmfeipqk/tmppzb6ivbz /root/.ansible/tmp/ansible-tmp-1726882681.315308-25326-245920767746401/AnsiballZ_stat.py <<< 25201 1726882681.61313: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 25201 1726882681.62902: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882681.63069: stderr chunk (state=3): >>><<< 25201 1726882681.63073: stdout chunk (state=3): >>><<< 25201 1726882681.63075: done transferring module to remote 25201 1726882681.63078: _low_level_execute_command(): starting 25201 1726882681.63080: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882681.315308-25326-245920767746401/ /root/.ansible/tmp/ansible-tmp-1726882681.315308-25326-245920767746401/AnsiballZ_stat.py && sleep 0' 25201 1726882681.63692: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25201 1726882681.63700: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882681.63720: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882681.63752: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882681.63805: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882681.63830: stderr chunk (state=3): >>>debug2: match not found <<< 25201 1726882681.63832: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882681.64288: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25201 1726882681.65033: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 25201 1726882681.65046: stderr chunk (state=3): 
>>>debug1: re-parsing configuration <<< 25201 1726882681.65058: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882681.65092: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882681.65110: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882681.65123: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882681.65136: stderr chunk (state=3): >>>debug2: match found <<< 25201 1726882681.65155: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882681.65242: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882681.65262: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882681.65280: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882681.65412: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882681.67240: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882681.67243: stdout chunk (state=3): >>><<< 25201 1726882681.67245: stderr chunk (state=3): >>><<< 25201 1726882681.67345: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25201 1726882681.67348: _low_level_execute_command(): starting 25201 1726882681.67351: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882681.315308-25326-245920767746401/AnsiballZ_stat.py && sleep 0' 25201 1726882681.68625: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25201 1726882681.68673: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882681.68690: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882681.68767: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882681.68828: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882681.68842: stderr chunk (state=3): >>>debug2: match not found <<< 25201 1726882681.68866: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882681.68899: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25201 1726882681.68951: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 25201 1726882681.68968: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 25201 1726882681.68982: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882681.68999: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882681.69016: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882681.69028: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882681.69061: stderr chunk (state=3): >>>debug2: match found <<< 25201 1726882681.69113: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882681.69297: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882681.69347: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882681.69369: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882681.69531: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882681.72255: stdout chunk (state=3): >>>import _frozen_importlib # frozen import _imp # builtin <<< 25201 1726882681.72352: stdout chunk (state=3): >>>import '_thread' # import '_warnings' # import '_weakref' # <<< 25201 1726882681.72383: stdout chunk (state=3): >>>import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook <<< 25201 1726882681.72423: stdout chunk (state=3): >>>import 'time' # import 'zipimport' # # installed zipimport hook <<< 25201 1726882681.72506: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' <<< 25201 1726882681.72510: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py <<< 25201 1726882681.72660: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' import '_codecs' # import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3566298dc0> # /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f356623d3a0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3566298b20> # /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3566298ac0> <<< 25201 1726882681.72687: stdout chunk (state=3): >>>import '_signal' # <<< 25201 1726882681.72732: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches 
/usr/lib64/python3.9/encodings/latin_1.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7f356623d490> # /usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py # code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' <<< 25201 1726882681.72792: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py # code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' import '_abc' # import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f356623d940> import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7f356623d670> <<< 25201 1726882681.72844: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py # code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py <<< 25201 1726882681.72875: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' <<< 25201 1726882681.72917: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py # code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' <<< 25201 1726882681.72967: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3565fcf190> # /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py # code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' <<< 25201 1726882681.73040: stdout chunk (state=3): >>>import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3565fcf220> <<< 25201 1726882681.73080: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py # code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' <<< 25201 1726882681.73103: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py # code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3565ff2850> import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3565fcf940> <<< 25201 1726882681.73147: stdout chunk (state=3): >>>import 'os' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3566255880> <<< 25201 1726882681.73161: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py # code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3565fc8d90> <<< 25201 1726882681.73216: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py # code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' <<< 25201 1726882681.73229: stdout chunk (state=3): >>>import '_locale' # import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3565ff2d90> <<< 25201 1726882681.73274: 
stdout chunk (state=3): >>>import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7f356623d970> <<< 25201 1726882681.73304: stdout chunk (state=3): >>>Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. <<< 25201 1726882681.74246: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py # code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py # code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py # code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py # code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3565f6eeb0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3565f71f40> # /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py # code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' import '_sre' # # /usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py # code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py # code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3565f67610> import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3565f6d640> import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3565f6e370> # /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py # code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/collections/__init__.py # code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py # code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3565eefdf0> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3565eef8e0> import 'itertools' # # /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py # code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3565eefee0> # /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches /usr/lib64/python3.9/operator.py # code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' import '_operator' # import 
'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3565eeffa0> # /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py # code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3565eefeb0> import '_collections' # <<< 25201 1726882681.74337: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3565f49d60> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3565f42640> <<< 25201 1726882681.74408: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py # code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3565f556a0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3565f75df0> <<< 25201 1726882681.74438: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' <<< 25201 1726882681.74480: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3565f02ca0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3565f49280> <<< 25201 1726882681.74521: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3565f552b0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3565f7b9a0> <<< 25201 1726882681.74565: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py <<< 25201 1726882681.74569: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' <<< 25201 1726882681.74596: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' <<< 25201 1726882681.74668: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py # code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' <<< 25201 1726882681.74672: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3565f02fd0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3565f02dc0> <<< 25201 1726882681.74679: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches /usr/lib64/python3.9/importlib/machinery.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' import 'importlib.machinery' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f3565f02d30> <<< 25201 1726882681.74708: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' <<< 25201 1726882681.74754: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py <<< 25201 1726882681.74757: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' <<< 25201 1726882681.74782: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py <<< 25201 1726882681.74841: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' <<< 25201 1726882681.74877: stdout chunk (state=3): >>># /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py # code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3565ed53a0> <<< 25201 1726882681.74904: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches /usr/lib64/python3.9/contextlib.py <<< 25201 1726882681.74914: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' <<< 25201 1726882681.74965: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3565ed5490> <<< 25201 1726882681.75144: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3565f0afd0> <<< 25201 1726882681.75195: stdout chunk (state=3): >>>import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3565f04a60> <<< 25201 1726882681.75494: stdout chunk (state=3): >>>import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3565f04580> # /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py # code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py # code object from '/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py # code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3565e091f0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3565ec0b80> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3565f04ee0> import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3565f75fd0> # /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py <<< 25201 1726882681.75496: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' <<< 25201 1726882681.75530: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py <<< 25201 1726882681.75545: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' <<< 25201 1726882681.75556: stdout chunk (state=3): >>>import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3565e1bb20> <<< 25201 1726882681.75584: stdout chunk (state=3): >>>import 'errno' # <<< 25201 1726882681.75633: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' <<< 25201 1726882681.75653: stdout chunk (state=3): >>># extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' <<< 25201 1726882681.75658: stdout chunk (state=3): >>>import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3565e1be50> <<< 25201 1726882681.75688: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py <<< 25201 1726882681.75714: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' <<< 25201 1726882681.75748: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py <<< 25201 1726882681.75778: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' <<< 25201 1726882681.75802: stdout chunk (state=3): >>>import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3565e2d760> <<< 25201 1726882681.75840: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py <<< 25201 1726882681.75897: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc'<<< 25201 1726882681.75904: stdout chunk (state=3): >>> <<< 25201 1726882681.75943: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3565e2dca0> <<< 25201 1726882681.75990: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' <<< 25201 1726882681.76014: stdout chunk (state=3): >>># extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' <<< 25201 1726882681.76022: stdout chunk (state=3): >>>import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3565dc53d0> <<< 25201 1726882681.76052: stdout chunk (state=3): >>>import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3565e1bf40> <<< 25201 1726882681.76088: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py <<< 25201 1726882681.76105: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' <<< 25201 1726882681.76191: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' <<< 25201 1726882681.76221: stdout chunk (state=3): >>># extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' <<< 25201 1726882681.76227: stdout chunk (state=3): >>>import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3565dd62b0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3565e2d5e0><<< 25201 1726882681.76229: stdout chunk (state=3): >>> <<< 25201 1726882681.76240: stdout chunk (state=3): >>>import 'pwd' # <<< 
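
The PYTHONVERBOSE trace running through this stretch records the remote interpreter importing the stat module's dependencies, and a few lines further on it reports finding the module code inside the transferred payload archive ("zipimport: found 30 names in ... ansible_stat_payload.zip"). As a minimal, self-contained sketch of that zipimport mechanism only, and not of Ansible's actual AnsiballZ wrapper, the snippet below builds a throwaway zip, puts it on sys.path, and imports a module from it; the archive and module names are hypothetical.

# Minimal sketch of importing code straight from a zip archive, the same
# mechanism the "zipimport: found 30 names in ..." line below refers to.
# The archive name and module name here are made up for illustration.
import importlib
import os
import sys
import tempfile
import zipfile

with tempfile.TemporaryDirectory() as tmp:
    payload = os.path.join(tmp, "demo_payload.zip")
    with zipfile.ZipFile(payload, "w") as zf:
        zf.writestr("demo_mod.py", "MESSAGE = 'imported from zip'\n")

    sys.path.insert(0, payload)                  # archive becomes importable
    demo_mod = importlib.import_module("demo_mod")
    print(demo_mod.MESSAGE)                      # -> imported from zip
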
25201 1726882681.76302: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so'<<< 25201 1726882681.76306: stdout chunk (state=3): >>> # extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so'<<< 25201 1726882681.76318: stdout chunk (state=3): >>> import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3565dd6370> <<< 25201 1726882681.76380: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3565f02a00><<< 25201 1726882681.76412: stdout chunk (state=3): >>> # /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py <<< 25201 1726882681.76454: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' <<< 25201 1726882681.76482: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py <<< 25201 1726882681.76515: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' <<< 25201 1726882681.76586: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' <<< 25201 1726882681.76649: stdout chunk (state=3): >>># extension module 'math' executed from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3565df16d0> # /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' <<< 25201 1726882681.76687: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3565df19a0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3565df1790> <<< 25201 1726882681.76700: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3565df1880> <<< 25201 1726882681.76724: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc matches /usr/lib64/python3.9/hashlib.py <<< 25201 1726882681.76742: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' <<< 25201 1726882681.77687: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3565df1cd0> # extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f3565dfe220> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3565df1910> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3565de5a60> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3565f025e0> # /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py # code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3565df1ac0> # code object from '/usr/lib64/python3.9/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f35657e76a0> # zipimport: found 30 names in '/tmp/ansible_stat_payload_p9wnbq17/ansible_stat_payload.zip' # zipimport: zlib available # zipimport: zlib available import ansible # loaded from Zip /tmp/ansible_stat_payload_p9wnbq17/ansible_stat_payload.zip/ansible/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils # loaded from Zip /tmp/ansible_stat_payload_p9wnbq17/ansible_stat_payload.zip/ansible/module_utils/__init__.py # zipimport: zlib available <<< 25201 1726882681.78811: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882681.79775: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py # code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f356570d7f0> <<< 25201 1726882681.79892: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py # code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py # code object from '/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' # extension module '_json' loaded from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f356570d160> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f356570d280> <<< 25201 1726882681.79945: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f356570df40> # /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' <<< 25201 1726882681.80042: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f356570d4f0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f356570dd60> import 'atexit' # <<< 25201 1726882681.80062: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' # extension module 'fcntl' executed from 
'/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f356570dfa0> # /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py <<< 25201 1726882681.80076: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' <<< 25201 1726882681.80153: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f356570d100> # /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py # code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py <<< 25201 1726882681.80178: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' <<< 25201 1726882681.80206: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py # code object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' <<< 25201 1726882681.81017: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3565664f10> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3565683f10> # extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3565683d30> # /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py # code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35656833a0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3565773dc0> import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35657733a0> # /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py # code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3565773fa0> # /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py # code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py # code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches /usr/lib64/python3.9/tokenize.py # code object from '/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py # code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3565744c70> import 'tokenize' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f35656dfd00> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35656df3d0> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35657154c0> # extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f35656df4f0> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35656df520> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py <<< 25201 1726882681.81057: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3565645310> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3565755220> # /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py # code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' <<< 25201 1726882681.81108: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3565651880> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35657553a0> <<< 25201 1726882681.81200: stdout chunk (state=3): >>># /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py # code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py # code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' import '_string' # <<< 25201 1726882681.81255: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f356576ddc0> <<< 25201 1726882681.81382: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3565651820> <<< 25201 1726882681.81474: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f3565651670> <<< 25201 1726882681.81502: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3565650610> <<< 25201 1726882681.81584: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3565650520> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f356574c8e0> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' <<< 25201 1726882681.81606: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py # code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' <<< 25201 1726882681.81658: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' <<< 25201 1726882681.81680: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f35656d66a0> <<< 25201 1726882681.81951: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f35656d4af0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35656e40a0> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f35656d6100> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3565719ac0> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat # loaded from Zip /tmp/ansible_stat_payload_p9wnbq17/ansible_stat_payload.zip/ansible/module_utils/compat/__init__.py # zipimport: zlib available <<< 25201 1726882681.82004: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882681.82083: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 25201 1726882681.82123: stdout chunk (state=3): >>>import ansible.module_utils.common # loaded from Zip /tmp/ansible_stat_payload_p9wnbq17/ansible_stat_payload.zip/ansible/module_utils/common/__init__.py # zipimport: zlib available # zipimport: zlib available import 
ansible.module_utils.common.text # loaded from Zip /tmp/ansible_stat_payload_p9wnbq17/ansible_stat_payload.zip/ansible/module_utils/common/text/__init__.py # zipimport: zlib available <<< 25201 1726882681.82227: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882681.82324: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882681.82791: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882681.83237: stdout chunk (state=3): >>>import ansible.module_utils.six # loaded from Zip /tmp/ansible_stat_payload_p9wnbq17/ansible_stat_payload.zip/ansible/module_utils/six/__init__.py import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import ansible.module_utils.common.text.converters # loaded from Zip /tmp/ansible_stat_payload_p9wnbq17/ansible_stat_payload.zip/ansible/module_utils/common/text/converters.py <<< 25201 1726882681.83272: stdout chunk (state=3): >>># /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py <<< 25201 1726882681.83275: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' <<< 25201 1726882681.83333: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f356522f5b0> <<< 25201 1726882681.83400: stdout chunk (state=3): >>># /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' <<< 25201 1726882681.83417: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3565620550> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35651d10d0> <<< 25201 1726882681.83478: stdout chunk (state=3): >>>import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_stat_payload_p9wnbq17/ansible_stat_payload.zip/ansible/module_utils/compat/selinux.py # zipimport: zlib available <<< 25201 1726882681.83507: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils._text # loaded from Zip /tmp/ansible_stat_payload_p9wnbq17/ansible_stat_payload.zip/ansible/module_utils/_text.py <<< 25201 1726882681.83512: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882681.83628: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882681.83756: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py # code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' <<< 25201 1726882681.83789: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35656d4be0> # zipimport: zlib available <<< 25201 1726882681.84177: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882681.84530: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882681.84588: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882681.84657: stdout chunk (state=3): >>>import ansible.module_utils.common.collections # loaded from Zip 
/tmp/ansible_stat_payload_p9wnbq17/ansible_stat_payload.zip/ansible/module_utils/common/collections.py <<< 25201 1726882681.84662: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882681.84686: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882681.84717: stdout chunk (state=3): >>>import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_stat_payload_p9wnbq17/ansible_stat_payload.zip/ansible/module_utils/common/warnings.py # zipimport: zlib available <<< 25201 1726882681.84801: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882681.84998: stdout chunk (state=3): >>>import ansible.module_utils.errors # loaded from Zip /tmp/ansible_stat_payload_p9wnbq17/ansible_stat_payload.zip/ansible/module_utils/errors.py # zipimport: zlib available <<< 25201 1726882681.85002: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_stat_payload_p9wnbq17/ansible_stat_payload.zip/ansible/module_utils/parsing/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing.convert_bool # loaded from Zip /tmp/ansible_stat_payload_p9wnbq17/ansible_stat_payload.zip/ansible/module_utils/parsing/convert_bool.py # zipimport: zlib available <<< 25201 1726882681.85144: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882681.85332: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py <<< 25201 1726882681.85368: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' import '_ast' # <<< 25201 1726882681.85435: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f356562e9a0> <<< 25201 1726882681.85449: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882681.85498: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882681.85574: stdout chunk (state=3): >>>import ansible.module_utils.common.text.formatters # loaded from Zip /tmp/ansible_stat_payload_p9wnbq17/ansible_stat_payload.zip/ansible/module_utils/common/text/formatters.py import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_stat_payload_p9wnbq17/ansible_stat_payload.zip/ansible/module_utils/common/validation.py import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_stat_payload_p9wnbq17/ansible_stat_payload.zip/ansible/module_utils/common/parameters.py import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_stat_payload_p9wnbq17/ansible_stat_payload.zip/ansible/module_utils/common/arg_spec.py <<< 25201 1726882681.85596: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882681.85623: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882681.85678: stdout chunk (state=3): >>>import ansible.module_utils.common.locale # loaded from Zip /tmp/ansible_stat_payload_p9wnbq17/ansible_stat_payload.zip/ansible/module_utils/common/locale.py <<< 25201 1726882681.85682: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882681.85719: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882681.85744: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882681.85839: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882681.85891: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py <<< 25201 1726882681.85920: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' <<< 25201 1726882681.85991: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3565760250> <<< 25201 1726882681.86017: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f356562ef10> <<< 25201 1726882681.86055: stdout chunk (state=3): >>>import ansible.module_utils.common.file # loaded from Zip /tmp/ansible_stat_payload_p9wnbq17/ansible_stat_payload.zip/ansible/module_utils/common/file.py import ansible.module_utils.common.process # loaded from Zip /tmp/ansible_stat_payload_p9wnbq17/ansible_stat_payload.zip/ansible/module_utils/common/process.py # zipimport: zlib available <<< 25201 1726882681.86187: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882681.86229: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882681.86253: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882681.86310: stdout chunk (state=3): >>># /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py <<< 25201 1726882681.86329: stdout chunk (state=3): >>># code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' <<< 25201 1726882681.86342: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py <<< 25201 1726882681.86359: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' <<< 25201 1726882681.86393: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py <<< 25201 1726882681.86396: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' <<< 25201 1726882681.86475: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35656137f0> <<< 25201 1726882681.86517: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f356560f820> <<< 25201 1726882681.86572: stdout chunk (state=3): >>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3565609a00> # destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip /tmp/ansible_stat_payload_p9wnbq17/ansible_stat_payload.zip/ansible/module_utils/distro/__init__.py # zipimport: zlib available <<< 25201 1726882681.86615: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882681.86634: stdout chunk (state=3): >>>import ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_stat_payload_p9wnbq17/ansible_stat_payload.zip/ansible/module_utils/common/_utils.py import ansible.module_utils.common.sys_info # loaded from Zip /tmp/ansible_stat_payload_p9wnbq17/ansible_stat_payload.zip/ansible/module_utils/common/sys_info.py <<< 25201 1726882681.86705: stdout 
chunk (state=3): >>>import ansible.module_utils.basic # loaded from Zip /tmp/ansible_stat_payload_p9wnbq17/ansible_stat_payload.zip/ansible/module_utils/basic.py <<< 25201 1726882681.86727: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import ansible.modules # loaded from Zip /tmp/ansible_stat_payload_p9wnbq17/ansible_stat_payload.zip/ansible/modules/__init__.py # zipimport: zlib available <<< 25201 1726882681.86849: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882681.87011: stdout chunk (state=3): >>># zipimport: zlib available <<< 25201 1726882681.87192: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ <<< 25201 1726882681.87416: stdout chunk (state=3): >>># clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache <<< 25201 1726882681.87429: stdout chunk (state=3): >>># clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix <<< 25201 1726882681.87452: stdout chunk (state=3): >>># cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg <<< 25201 1726882681.87525: stdout chunk (state=3): >>># cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing 
importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale <<< 25201 1726882681.87536: stdout chunk (state=3): >>># cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing 
ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules <<< 25201 1726882681.87742: stdout chunk (state=3): >>># destroy _sitebuiltins # destroy importlib.util # destroy importlib.abc # destroy importlib.machinery <<< 25201 1726882681.87798: stdout chunk (state=3): >>># destroy zipimport # destroy _compression # destroy binascii # destroy importlib # destroy struct # destroy bz2 # destroy lzma # destroy __main__ # destroy locale # destroy tempfile <<< 25201 1726882681.87842: stdout chunk (state=3): >>># destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog # destroy uuid # destroy array # destroy datetime # destroy selinux <<< 25201 1726882681.87886: stdout chunk (state=3): >>># destroy distro # destroy json # destroy shlex # destroy logging # destroy argparse <<< 25201 1726882681.88014: stdout chunk (state=3): >>># cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping 
selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch <<< 25201 1726882681.88074: stdout chunk (state=3): >>># cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os <<< 25201 1726882681.88093: stdout chunk (state=3): >>># cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal <<< 25201 1726882681.88251: stdout chunk (state=3): >>># destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize <<< 25201 1726882681.88304: stdout chunk (state=3): >>># destroy _heapq # destroy posixpath # destroy stat <<< 25201 1726882681.88318: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal <<< 25201 1726882681.88331: stdout chunk (state=3): >>># destroy _frozen_importlib # clear sys.audit hooks <<< 25201 
1726882681.88736: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. <<< 25201 1726882681.88739: stdout chunk (state=3): >>><<< 25201 1726882681.88742: stderr chunk (state=3): >>><<< 25201 1726882681.88841: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py # code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' import '_codecs' # import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3566298dc0> # /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f356623d3a0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3566298b20> # /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3566298ac0> import '_signal' # # /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7f356623d490> # /usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py # code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py # code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' import '_abc' # import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f356623d940> import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7f356623d670> # /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py # code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py # code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py # code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' import '_stat' # import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3565fcf190> # /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py # code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3565fcf220> # 
/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py # code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py # code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3565ff2850> import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3565fcf940> import 'os' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3566255880> # /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py # code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3565fc8d90> # /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py # code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' import '_locale' # import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3565ff2d90> import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7f356623d970> Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py # code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py # code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py # code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py # code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3565f6eeb0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3565f71f40> # /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py # code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' import '_sre' # # /usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py # code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py # code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3565f67610> import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3565f6d640> import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3565f6e370> # /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py # code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/collections/__init__.py # code object from 
'/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py # code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3565eefdf0> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3565eef8e0> import 'itertools' # # /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py # code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3565eefee0> # /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches /usr/lib64/python3.9/operator.py # code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3565eeffa0> # /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py # code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3565eefeb0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3565f49d60> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3565f42640> # /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py # code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3565f556a0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3565f75df0> # /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3565f02ca0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3565f49280> # extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3565f552b0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3565f7b9a0> # /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py # code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py # 
code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3565f02fd0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3565f02dc0> # /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches /usr/lib64/python3.9/importlib/machinery.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3565f02d30> # /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py # code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py # code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3565ed53a0> # /usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches /usr/lib64/python3.9/contextlib.py # code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3565ed5490> import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3565f0afd0> import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3565f04a60> import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3565f04580> # /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py # code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py # code object from '/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py # code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3565e091f0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3565ec0b80> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3565f04ee0> import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3565f75fd0> # /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py # code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py # code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3565e1bb20> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' # extension module 'zlib' executed from 
'/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3565e1be50> # /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py # code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py # code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3565e2d760> # /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py # code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3565e2dca0> # extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3565dc53d0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3565e1bf40> # /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py # code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3565dd62b0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3565e2d5e0> import 'pwd' # # extension module 'grp' loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3565dd6370> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3565f02a00> # /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py # code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py # code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' # extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3565df16d0> # /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3565df19a0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3565df1790> # extension module '_random' loaded from 
'/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3565df1880> # /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc matches /usr/lib64/python3.9/hashlib.py # code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3565df1cd0> # extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3565dfe220> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3565df1910> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3565de5a60> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3565f025e0> # /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py # code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3565df1ac0> # code object from '/usr/lib64/python3.9/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f35657e76a0> # zipimport: found 30 names in '/tmp/ansible_stat_payload_p9wnbq17/ansible_stat_payload.zip' # zipimport: zlib available # zipimport: zlib available import ansible # loaded from Zip /tmp/ansible_stat_payload_p9wnbq17/ansible_stat_payload.zip/ansible/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils # loaded from Zip /tmp/ansible_stat_payload_p9wnbq17/ansible_stat_payload.zip/ansible/module_utils/__init__.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py # code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f356570d7f0> # /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py # code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py # code object from '/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' # extension module '_json' loaded from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f356570d160> import 'json.scanner' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f356570d280> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f356570df40> # /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f356570d4f0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f356570dd60> import 'atexit' # # extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f356570dfa0> # /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py # code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f356570d100> # /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py # code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py # code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py # code object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3565664f10> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3565683f10> # extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3565683d30> # /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py # code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35656833a0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3565773dc0> import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35657733a0> # /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py # code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3565773fa0> # /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py # code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py # code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' # 
/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches /usr/lib64/python3.9/tokenize.py # code object from '/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py # code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3565744c70> import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35656dfd00> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35656df3d0> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35657154c0> # extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f35656df4f0> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35656df520> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py # code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3565645310> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3565755220> # /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py # code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3565651880> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35657553a0> # /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py # code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py # code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f356576ddc0> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3565651820> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from 
'/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3565651670> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3565650610> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3565650520> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f356574c8e0> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py # code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f35656d66a0> # extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f35656d4af0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35656e40a0> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f35656d6100> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3565719ac0> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat # loaded from Zip /tmp/ansible_stat_payload_p9wnbq17/ansible_stat_payload.zip/ansible/module_utils/compat/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common # loaded from Zip /tmp/ansible_stat_payload_p9wnbq17/ansible_stat_payload.zip/ansible/module_utils/common/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_stat_payload_p9wnbq17/ansible_stat_payload.zip/ansible/module_utils/common/text/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.six # loaded from Zip 
/tmp/ansible_stat_payload_p9wnbq17/ansible_stat_payload.zip/ansible/module_utils/six/__init__.py import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import ansible.module_utils.common.text.converters # loaded from Zip /tmp/ansible_stat_payload_p9wnbq17/ansible_stat_payload.zip/ansible/module_utils/common/text/converters.py # /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f356522f5b0> # /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3565620550> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35651d10d0> import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_stat_payload_p9wnbq17/ansible_stat_payload.zip/ansible/module_utils/compat/selinux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils._text # loaded from Zip /tmp/ansible_stat_payload_p9wnbq17/ansible_stat_payload.zip/ansible/module_utils/_text.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py # code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35656d4be0> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.collections # loaded from Zip /tmp/ansible_stat_payload_p9wnbq17/ansible_stat_payload.zip/ansible/module_utils/common/collections.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_stat_payload_p9wnbq17/ansible_stat_payload.zip/ansible/module_utils/common/warnings.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.errors # loaded from Zip /tmp/ansible_stat_payload_p9wnbq17/ansible_stat_payload.zip/ansible/module_utils/errors.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_stat_payload_p9wnbq17/ansible_stat_payload.zip/ansible/module_utils/parsing/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing.convert_bool # loaded from Zip /tmp/ansible_stat_payload_p9wnbq17/ansible_stat_payload.zip/ansible/module_utils/parsing/convert_bool.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py # code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f356562e9a0> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text.formatters # loaded from Zip 
/tmp/ansible_stat_payload_p9wnbq17/ansible_stat_payload.zip/ansible/module_utils/common/text/formatters.py import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_stat_payload_p9wnbq17/ansible_stat_payload.zip/ansible/module_utils/common/validation.py import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_stat_payload_p9wnbq17/ansible_stat_payload.zip/ansible/module_utils/common/parameters.py import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_stat_payload_p9wnbq17/ansible_stat_payload.zip/ansible/module_utils/common/arg_spec.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.locale # loaded from Zip /tmp/ansible_stat_payload_p9wnbq17/ansible_stat_payload.zip/ansible/module_utils/common/locale.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3565760250> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f356562ef10> import ansible.module_utils.common.file # loaded from Zip /tmp/ansible_stat_payload_p9wnbq17/ansible_stat_payload.zip/ansible/module_utils/common/file.py import ansible.module_utils.common.process # loaded from Zip /tmp/ansible_stat_payload_p9wnbq17/ansible_stat_payload.zip/ansible/module_utils/common/process.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py # code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py # code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py # code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f35656137f0> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f356560f820> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3565609a00> # destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip /tmp/ansible_stat_payload_p9wnbq17/ansible_stat_payload.zip/ansible/module_utils/distro/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_stat_payload_p9wnbq17/ansible_stat_payload.zip/ansible/module_utils/common/_utils.py import ansible.module_utils.common.sys_info # loaded from Zip /tmp/ansible_stat_payload_p9wnbq17/ansible_stat_payload.zip/ansible/module_utils/common/sys_info.py import ansible.module_utils.basic # loaded from Zip 
/tmp/ansible_stat_payload_p9wnbq17/ansible_stat_payload.zip/ansible/module_utils/basic.py # zipimport: zlib available # zipimport: zlib available import ansible.modules # loaded from Zip /tmp/ansible_stat_payload_p9wnbq17/ansible_stat_payload.zip/ansible/modules/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing 
threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing 
ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.util # destroy importlib.abc # destroy importlib.machinery # destroy zipimport # destroy _compression # destroy binascii # destroy importlib # destroy struct # destroy bz2 # destroy lzma # destroy __main__ # destroy locale # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog # destroy uuid # destroy array # destroy datetime # destroy selinux # destroy distro # destroy json # destroy shlex # destroy logging # destroy argparse # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping 
functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal # destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize # destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal # destroy _frozen_importlib # clear sys.audit hooks , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
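The invocation JSON embedded in the module output above ("path": "/run/ostree-booted", "get_checksum": true, and so on) comes from the stat task named "Check if system is ostree", whose result is reported further down. A minimal sketch of what that task likely looks like, reconstructed only from the module arguments printed in the log; the register name is an inference from the __ostree_booted_stat variable that the following set_fact task reads, since it is not printed for this task itself:

  - name: Check if system is ostree
    ansible.builtin.stat:
      path: /run/ostree-booted
    register: __ostree_booted_stat  # inferred: the next task reads __ostree_booted_stat

/run/ostree-booted exists only on OSTree-based systems, so stat.exists coming back false simply means the managed node is a conventional (non-OSTree) install.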
[WARNING]: Module invocation had junk after the JSON data: # destroy __main__ # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # 
cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] 
removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.util # destroy importlib.abc # destroy importlib.machinery # destroy zipimport # destroy _compression # destroy binascii # destroy importlib # destroy struct # destroy bz2 # destroy lzma # destroy __main__ # destroy locale # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog # destroy uuid # destroy array # destroy datetime # destroy selinux # destroy distro # destroy json # destroy shlex # destroy logging # destroy argparse # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # 
cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal # destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize # destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal # destroy _frozen_importlib # clear sys.audit hooks 25201 1726882681.89595: done with _execute_module (stat, {'path': '/run/ostree-booted', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882681.315308-25326-245920767746401/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 25201 1726882681.89599: _low_level_execute_command(): starting 25201 1726882681.89601: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882681.315308-25326-245920767746401/ > /dev/null 2>&1 && sleep 0' 25201 1726882681.90257: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25201 1726882681.90273: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882681.90287: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882681.90303: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882681.90355: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882681.90373: stderr chunk (state=3): >>>debug2: match not found <<< 25201 1726882681.90387: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882681.90403: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25201 1726882681.90414: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 25201 1726882681.90423: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 25201 1726882681.90433: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882681.90449: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882681.90475: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882681.90487: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882681.90497: stderr chunk (state=3): >>>debug2: match found <<< 25201 1726882681.90509: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882681.90594: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882681.90617: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882681.90633: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882681.90759: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882681.92625: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882681.92687: stderr chunk (state=3): >>><<< 25201 1726882681.92690: stdout chunk (state=3): >>><<< 25201 1726882681.92744: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25201 1726882681.92748: handler run complete 25201 1726882681.92750: attempt loop complete, returning result 25201 1726882681.92753: _execute() done 25201 1726882681.92755: dumping result to json 25201 1726882681.92757: done dumping result, returning 25201 1726882681.92759: done running TaskExecutor() for managed_node2/TASK: Check if system is ostree [0e448fcc-3ce9-313b-197e-0000000000cc] 25201 1726882681.92761: sending task result for task 0e448fcc-3ce9-313b-197e-0000000000cc 25201 1726882681.92825: done sending task result for task 0e448fcc-3ce9-313b-197e-0000000000cc 25201 1726882681.92827: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "stat": 
{ "exists": false } } 25201 1726882681.92890: no more pending results, returning what we have 25201 1726882681.92893: results queue empty 25201 1726882681.92894: checking for any_errors_fatal 25201 1726882681.92900: done checking for any_errors_fatal 25201 1726882681.92901: checking for max_fail_percentage 25201 1726882681.92902: done checking for max_fail_percentage 25201 1726882681.92903: checking to see if all hosts have failed and the running result is not ok 25201 1726882681.92904: done checking to see if all hosts have failed 25201 1726882681.92904: getting the remaining hosts for this loop 25201 1726882681.92906: done getting the remaining hosts for this loop 25201 1726882681.92909: getting the next task for host managed_node2 25201 1726882681.92916: done getting next task for host managed_node2 25201 1726882681.92917: ^ task is: TASK: Set flag to indicate system is ostree 25201 1726882681.92920: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25201 1726882681.92923: getting variables 25201 1726882681.92925: in VariableManager get_vars() 25201 1726882681.92952: Calling all_inventory to load vars for managed_node2 25201 1726882681.92955: Calling groups_inventory to load vars for managed_node2 25201 1726882681.92959: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882681.92973: Calling all_plugins_play to load vars for managed_node2 25201 1726882681.92976: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882681.92979: Calling groups_plugins_play to load vars for managed_node2 25201 1726882681.93133: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882681.93256: done with get_vars() 25201 1726882681.93267: done getting variables 25201 1726882681.93338: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Set flag to indicate system is ostree] *********************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:22 Friday 20 September 2024 21:38:01 -0400 (0:00:00.728) 0:00:03.108 ****** 25201 1726882681.93366: entering _queue_task() for managed_node2/set_fact 25201 1726882681.93368: Creating lock for set_fact 25201 1726882681.93568: worker is 1 (out of 1 available) 25201 1726882681.93579: exiting _queue_task() for managed_node2/set_fact 25201 1726882681.93591: done queuing things up, now waiting for results queue to drain 25201 1726882681.93593: waiting for pending results... 
25201 1726882681.93743: running TaskExecutor() for managed_node2/TASK: Set flag to indicate system is ostree 25201 1726882681.93810: in run() - task 0e448fcc-3ce9-313b-197e-0000000000cd 25201 1726882681.93820: variable 'ansible_search_path' from source: unknown 25201 1726882681.93823: variable 'ansible_search_path' from source: unknown 25201 1726882681.93850: calling self._execute() 25201 1726882681.93907: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882681.93912: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882681.93922: variable 'omit' from source: magic vars 25201 1726882681.94327: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 25201 1726882681.94515: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 25201 1726882681.94553: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 25201 1726882681.94871: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 25201 1726882681.94875: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 25201 1726882681.94878: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 25201 1726882681.94880: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 25201 1726882681.94883: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 25201 1726882681.94885: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 25201 1726882681.94887: Evaluated conditional (not __network_is_ostree is defined): True 25201 1726882681.94889: variable 'omit' from source: magic vars 25201 1726882681.95169: variable 'omit' from source: magic vars 25201 1726882681.95172: variable '__ostree_booted_stat' from source: set_fact 25201 1726882681.95175: variable 'omit' from source: magic vars 25201 1726882681.95177: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25201 1726882681.95180: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25201 1726882681.95182: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25201 1726882681.95184: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25201 1726882681.95186: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25201 1726882681.95510: variable 'inventory_hostname' from source: host vars for 'managed_node2' 25201 1726882681.95516: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882681.95519: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882681.95522: Set connection var ansible_shell_executable to /bin/sh 25201 
1726882681.95524: Set connection var ansible_pipelining to False 25201 1726882681.95525: Set connection var ansible_connection to ssh 25201 1726882681.95527: Set connection var ansible_module_compression to ZIP_DEFLATED 25201 1726882681.95529: Set connection var ansible_shell_type to sh 25201 1726882681.95531: Set connection var ansible_timeout to 10 25201 1726882681.95533: variable 'ansible_shell_executable' from source: unknown 25201 1726882681.95535: variable 'ansible_connection' from source: unknown 25201 1726882681.95537: variable 'ansible_module_compression' from source: unknown 25201 1726882681.95539: variable 'ansible_shell_type' from source: unknown 25201 1726882681.95541: variable 'ansible_shell_executable' from source: unknown 25201 1726882681.95543: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882681.95545: variable 'ansible_pipelining' from source: unknown 25201 1726882681.95547: variable 'ansible_timeout' from source: unknown 25201 1726882681.95549: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882681.95551: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 25201 1726882681.95553: variable 'omit' from source: magic vars 25201 1726882681.95555: starting attempt loop 25201 1726882681.95557: running the handler 25201 1726882681.95559: handler run complete 25201 1726882681.95561: attempt loop complete, returning result 25201 1726882681.95567: _execute() done 25201 1726882681.95569: dumping result to json 25201 1726882681.95571: done dumping result, returning 25201 1726882681.95572: done running TaskExecutor() for managed_node2/TASK: Set flag to indicate system is ostree [0e448fcc-3ce9-313b-197e-0000000000cd] 25201 1726882681.95574: sending task result for task 0e448fcc-3ce9-313b-197e-0000000000cd 25201 1726882681.95633: done sending task result for task 0e448fcc-3ce9-313b-197e-0000000000cd 25201 1726882681.95636: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "__network_is_ostree": false }, "changed": false } 25201 1726882681.95696: no more pending results, returning what we have 25201 1726882681.95698: results queue empty 25201 1726882681.95699: checking for any_errors_fatal 25201 1726882681.95704: done checking for any_errors_fatal 25201 1726882681.95705: checking for max_fail_percentage 25201 1726882681.95706: done checking for max_fail_percentage 25201 1726882681.95707: checking to see if all hosts have failed and the running result is not ok 25201 1726882681.95708: done checking to see if all hosts have failed 25201 1726882681.95708: getting the remaining hosts for this loop 25201 1726882681.95710: done getting the remaining hosts for this loop 25201 1726882681.95713: getting the next task for host managed_node2 25201 1726882681.95720: done getting next task for host managed_node2 25201 1726882681.95722: ^ task is: TASK: Fix CentOS6 Base repo 25201 1726882681.95724: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25201 1726882681.95727: getting variables 25201 1726882681.95728: in VariableManager get_vars() 25201 1726882681.95778: Calling all_inventory to load vars for managed_node2 25201 1726882681.95781: Calling groups_inventory to load vars for managed_node2 25201 1726882681.95783: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882681.95790: Calling all_plugins_play to load vars for managed_node2 25201 1726882681.95791: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882681.95797: Calling groups_plugins_play to load vars for managed_node2 25201 1726882681.95938: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882681.96111: done with get_vars() 25201 1726882681.96120: done getting variables 25201 1726882681.96226: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Fix CentOS6 Base repo] *************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:26 Friday 20 September 2024 21:38:01 -0400 (0:00:00.028) 0:00:03.137 ****** 25201 1726882681.96252: entering _queue_task() for managed_node2/copy 25201 1726882681.96481: worker is 1 (out of 1 available) 25201 1726882681.96492: exiting _queue_task() for managed_node2/copy 25201 1726882681.96503: done queuing things up, now waiting for results queue to drain 25201 1726882681.96504: waiting for pending results... 
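The trace above is the whole life cycle of the "Set flag to indicate system is ostree" task: the guard (not __network_is_ostree is defined) evaluates to True, __ostree_booted_stat is read back from an earlier set_fact, and the handler returns __network_is_ostree: false without ever opening a connection (set_fact runs on the controller). A minimal sketch of what such a task usually looks like, assuming the flag is derived from a prior stat of /run/ostree-booted; the real task in el_repo_setup.yml may be worded differently:

    - name: Set flag to indicate system is ostree
      set_fact:
        # assumption: __ostree_booted_stat holds a stat result from an earlier task;
        # the trace only shows that the variable originates from set_fact
        __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"
      when: not __network_is_ostree is defined
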
25201 1726882681.96731: running TaskExecutor() for managed_node2/TASK: Fix CentOS6 Base repo 25201 1726882681.96808: in run() - task 0e448fcc-3ce9-313b-197e-0000000000cf 25201 1726882681.96819: variable 'ansible_search_path' from source: unknown 25201 1726882681.96823: variable 'ansible_search_path' from source: unknown 25201 1726882681.96858: calling self._execute() 25201 1726882681.96923: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882681.96934: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882681.96937: variable 'omit' from source: magic vars 25201 1726882681.97354: variable 'ansible_distribution' from source: facts 25201 1726882681.97375: Evaluated conditional (ansible_distribution == 'CentOS'): True 25201 1726882681.97456: variable 'ansible_distribution_major_version' from source: facts 25201 1726882681.97460: Evaluated conditional (ansible_distribution_major_version == '6'): False 25201 1726882681.97467: when evaluation is False, skipping this task 25201 1726882681.97469: _execute() done 25201 1726882681.97473: dumping result to json 25201 1726882681.97477: done dumping result, returning 25201 1726882681.97480: done running TaskExecutor() for managed_node2/TASK: Fix CentOS6 Base repo [0e448fcc-3ce9-313b-197e-0000000000cf] 25201 1726882681.97489: sending task result for task 0e448fcc-3ce9-313b-197e-0000000000cf 25201 1726882681.97570: done sending task result for task 0e448fcc-3ce9-313b-197e-0000000000cf 25201 1726882681.97573: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 25201 1726882681.97653: no more pending results, returning what we have 25201 1726882681.97655: results queue empty 25201 1726882681.97656: checking for any_errors_fatal 25201 1726882681.97660: done checking for any_errors_fatal 25201 1726882681.97661: checking for max_fail_percentage 25201 1726882681.97666: done checking for max_fail_percentage 25201 1726882681.97667: checking to see if all hosts have failed and the running result is not ok 25201 1726882681.97668: done checking to see if all hosts have failed 25201 1726882681.97669: getting the remaining hosts for this loop 25201 1726882681.97670: done getting the remaining hosts for this loop 25201 1726882681.97672: getting the next task for host managed_node2 25201 1726882681.97676: done getting next task for host managed_node2 25201 1726882681.97678: ^ task is: TASK: Include the task 'enable_epel.yml' 25201 1726882681.97680: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25201 1726882681.97682: getting variables 25201 1726882681.97683: in VariableManager get_vars() 25201 1726882681.97702: Calling all_inventory to load vars for managed_node2 25201 1726882681.97705: Calling groups_inventory to load vars for managed_node2 25201 1726882681.97707: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882681.97714: Calling all_plugins_play to load vars for managed_node2 25201 1726882681.97716: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882681.97717: Calling groups_plugins_play to load vars for managed_node2 25201 1726882681.97820: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882681.97951: done with get_vars() 25201 1726882681.97957: done getting variables TASK [Include the task 'enable_epel.yml'] ************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:51 Friday 20 September 2024 21:38:01 -0400 (0:00:00.017) 0:00:03.154 ****** 25201 1726882681.98018: entering _queue_task() for managed_node2/include_tasks 25201 1726882681.98182: worker is 1 (out of 1 available) 25201 1726882681.98195: exiting _queue_task() for managed_node2/include_tasks 25201 1726882681.98206: done queuing things up, now waiting for results queue to drain 25201 1726882681.98208: waiting for pending results... 25201 1726882681.98346: running TaskExecutor() for managed_node2/TASK: Include the task 'enable_epel.yml' 25201 1726882681.98410: in run() - task 0e448fcc-3ce9-313b-197e-0000000000d0 25201 1726882681.98420: variable 'ansible_search_path' from source: unknown 25201 1726882681.98424: variable 'ansible_search_path' from source: unknown 25201 1726882681.98449: calling self._execute() 25201 1726882681.98505: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882681.98509: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882681.98517: variable 'omit' from source: magic vars 25201 1726882681.98845: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 25201 1726882682.00746: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 25201 1726882682.00795: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 25201 1726882682.00821: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 25201 1726882682.00845: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 25201 1726882682.00866: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 25201 1726882682.00925: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25201 1726882682.00944: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25201 1726882682.00963: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, 
class_only=False) 25201 1726882682.00997: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25201 1726882682.01009: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25201 1726882682.01095: variable '__network_is_ostree' from source: set_fact 25201 1726882682.01123: Evaluated conditional (not __network_is_ostree | d(false)): True 25201 1726882682.01127: _execute() done 25201 1726882682.01130: dumping result to json 25201 1726882682.01133: done dumping result, returning 25201 1726882682.01138: done running TaskExecutor() for managed_node2/TASK: Include the task 'enable_epel.yml' [0e448fcc-3ce9-313b-197e-0000000000d0] 25201 1726882682.01144: sending task result for task 0e448fcc-3ce9-313b-197e-0000000000d0 25201 1726882682.01225: done sending task result for task 0e448fcc-3ce9-313b-197e-0000000000d0 25201 1726882682.01227: WORKER PROCESS EXITING 25201 1726882682.01253: no more pending results, returning what we have 25201 1726882682.01257: in VariableManager get_vars() 25201 1726882682.01288: Calling all_inventory to load vars for managed_node2 25201 1726882682.01292: Calling groups_inventory to load vars for managed_node2 25201 1726882682.01295: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882682.01303: Calling all_plugins_play to load vars for managed_node2 25201 1726882682.01305: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882682.01308: Calling groups_plugins_play to load vars for managed_node2 25201 1726882682.01430: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882682.01540: done with get_vars() 25201 1726882682.01546: variable 'ansible_search_path' from source: unknown 25201 1726882682.01547: variable 'ansible_search_path' from source: unknown 25201 1726882682.01574: we have included files to process 25201 1726882682.01574: generating all_blocks data 25201 1726882682.01576: done generating all_blocks data 25201 1726882682.01580: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 25201 1726882682.01581: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 25201 1726882682.01582: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 25201 1726882682.02038: done processing included file 25201 1726882682.02039: iterating over new_blocks loaded from include file 25201 1726882682.02040: in VariableManager get_vars() 25201 1726882682.02048: done with get_vars() 25201 1726882682.02049: filtering new block on tags 25201 1726882682.02066: done filtering new block on tags 25201 1726882682.02067: in VariableManager get_vars() 25201 1726882682.02074: done with get_vars() 25201 1726882682.02075: filtering new block on tags 25201 1726882682.02081: done filtering new block on tags 25201 1726882682.02083: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml for managed_node2 25201 1726882682.02087: extending task lists for all hosts with included blocks 
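Immediately after that fact is set, the include above is gated on it: the guard (not __network_is_ostree | d(false)) evaluates to True, enable_epel.yml is loaded, and its blocks are spliced into the task list for managed_node2. A hedged sketch of the including task at el_repo_setup.yml:51:

    - name: Include the task 'enable_epel.yml'
      include_tasks: enable_epel.yml
      when: not __network_is_ostree | d(false)

Because include_tasks is dynamic, the file is only parsed once the condition passes, which is why the trace shows the new blocks being filtered on tags and appended to the task list at this point in the run.
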
25201 1726882682.02147: done extending task lists 25201 1726882682.02148: done processing included files 25201 1726882682.02148: results queue empty 25201 1726882682.02149: checking for any_errors_fatal 25201 1726882682.02150: done checking for any_errors_fatal 25201 1726882682.02151: checking for max_fail_percentage 25201 1726882682.02152: done checking for max_fail_percentage 25201 1726882682.02152: checking to see if all hosts have failed and the running result is not ok 25201 1726882682.02152: done checking to see if all hosts have failed 25201 1726882682.02153: getting the remaining hosts for this loop 25201 1726882682.02154: done getting the remaining hosts for this loop 25201 1726882682.02155: getting the next task for host managed_node2 25201 1726882682.02158: done getting next task for host managed_node2 25201 1726882682.02159: ^ task is: TASK: Create EPEL {{ ansible_distribution_major_version }} 25201 1726882682.02161: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25201 1726882682.02166: getting variables 25201 1726882682.02167: in VariableManager get_vars() 25201 1726882682.02172: Calling all_inventory to load vars for managed_node2 25201 1726882682.02173: Calling groups_inventory to load vars for managed_node2 25201 1726882682.02175: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882682.02178: Calling all_plugins_play to load vars for managed_node2 25201 1726882682.02183: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882682.02185: Calling groups_plugins_play to load vars for managed_node2 25201 1726882682.02318: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882682.02584: done with get_vars() 25201 1726882682.02592: done getting variables 25201 1726882682.02662: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) 25201 1726882682.02875: variable 'ansible_distribution_major_version' from source: facts TASK [Create EPEL 9] *********************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:8 Friday 20 September 2024 21:38:02 -0400 (0:00:00.049) 0:00:03.203 ****** 25201 1726882682.02922: entering _queue_task() for managed_node2/command 25201 1726882682.02924: Creating lock for command 25201 1726882682.03240: worker is 1 (out of 1 available) 25201 1726882682.03253: exiting _queue_task() for managed_node2/command 25201 1726882682.03311: done queuing things up, now waiting for results queue to drain 25201 1726882682.03313: waiting for pending results... 
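The banner rendered above, "Create EPEL 9", comes from a task name templated with ansible_distribution_major_version; the entries that follow evaluate the distribution and version guards and then skip the task on this CentOS 9 host. A hedged sketch of such a guarded command task; the actual command body never appears in this trace, so the line below is a placeholder only:

    - name: Create EPEL {{ ansible_distribution_major_version }}
      command: "true"   # placeholder; the real command in enable_epel.yml is not shown in this trace
      when:
        - ansible_distribution in ['RedHat', 'CentOS']
        - ansible_distribution_major_version in ['7', '8']
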
25201 1726882682.03515: running TaskExecutor() for managed_node2/TASK: Create EPEL 9 25201 1726882682.03592: in run() - task 0e448fcc-3ce9-313b-197e-0000000000ea 25201 1726882682.03596: variable 'ansible_search_path' from source: unknown 25201 1726882682.03599: variable 'ansible_search_path' from source: unknown 25201 1726882682.03624: calling self._execute() 25201 1726882682.03678: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882682.03682: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882682.03690: variable 'omit' from source: magic vars 25201 1726882682.03943: variable 'ansible_distribution' from source: facts 25201 1726882682.03951: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 25201 1726882682.04043: variable 'ansible_distribution_major_version' from source: facts 25201 1726882682.04047: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 25201 1726882682.04050: when evaluation is False, skipping this task 25201 1726882682.04052: _execute() done 25201 1726882682.04055: dumping result to json 25201 1726882682.04058: done dumping result, returning 25201 1726882682.04068: done running TaskExecutor() for managed_node2/TASK: Create EPEL 9 [0e448fcc-3ce9-313b-197e-0000000000ea] 25201 1726882682.04073: sending task result for task 0e448fcc-3ce9-313b-197e-0000000000ea 25201 1726882682.04160: done sending task result for task 0e448fcc-3ce9-313b-197e-0000000000ea 25201 1726882682.04165: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 25201 1726882682.04231: no more pending results, returning what we have 25201 1726882682.04234: results queue empty 25201 1726882682.04235: checking for any_errors_fatal 25201 1726882682.04236: done checking for any_errors_fatal 25201 1726882682.04237: checking for max_fail_percentage 25201 1726882682.04238: done checking for max_fail_percentage 25201 1726882682.04239: checking to see if all hosts have failed and the running result is not ok 25201 1726882682.04240: done checking to see if all hosts have failed 25201 1726882682.04240: getting the remaining hosts for this loop 25201 1726882682.04241: done getting the remaining hosts for this loop 25201 1726882682.04244: getting the next task for host managed_node2 25201 1726882682.04249: done getting next task for host managed_node2 25201 1726882682.04251: ^ task is: TASK: Install yum-utils package 25201 1726882682.04253: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25201 1726882682.04255: getting variables 25201 1726882682.04256: in VariableManager get_vars() 25201 1726882682.04285: Calling all_inventory to load vars for managed_node2 25201 1726882682.04287: Calling groups_inventory to load vars for managed_node2 25201 1726882682.04289: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882682.04297: Calling all_plugins_play to load vars for managed_node2 25201 1726882682.04299: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882682.04301: Calling groups_plugins_play to load vars for managed_node2 25201 1726882682.04401: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882682.04513: done with get_vars() 25201 1726882682.04519: done getting variables 25201 1726882682.04584: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Install yum-utils package] *********************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:26 Friday 20 September 2024 21:38:02 -0400 (0:00:00.016) 0:00:03.220 ****** 25201 1726882682.04604: entering _queue_task() for managed_node2/package 25201 1726882682.04606: Creating lock for package 25201 1726882682.04766: worker is 1 (out of 1 available) 25201 1726882682.04779: exiting _queue_task() for managed_node2/package 25201 1726882682.04789: done queuing things up, now waiting for results queue to drain 25201 1726882682.04791: waiting for pending results... 
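"Install yum-utils package" goes through the generic package action (the trace creates a lock for the plugin the first time it is seen) and, in the entries below, is skipped on the same version guard; the later "Enable EPEL 7" and "Enable EPEL 8" command tasks and the "Enable EPEL 6" copy task are skipped the same way on their own version guards, so only this one is sketched:

    - name: Install yum-utils package
      package:
        name: yum-utils
        state: present   # assumption; only the package name is implied by the task name in the trace
      when:
        - ansible_distribution in ['RedHat', 'CentOS']
        - ansible_distribution_major_version in ['7', '8']
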
25201 1726882682.04928: running TaskExecutor() for managed_node2/TASK: Install yum-utils package 25201 1726882682.04993: in run() - task 0e448fcc-3ce9-313b-197e-0000000000eb 25201 1726882682.05003: variable 'ansible_search_path' from source: unknown 25201 1726882682.05006: variable 'ansible_search_path' from source: unknown 25201 1726882682.05030: calling self._execute() 25201 1726882682.05086: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882682.05089: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882682.05098: variable 'omit' from source: magic vars 25201 1726882682.05856: variable 'ansible_distribution' from source: facts 25201 1726882682.05877: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 25201 1726882682.06002: variable 'ansible_distribution_major_version' from source: facts 25201 1726882682.06013: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 25201 1726882682.06021: when evaluation is False, skipping this task 25201 1726882682.06028: _execute() done 25201 1726882682.06034: dumping result to json 25201 1726882682.06040: done dumping result, returning 25201 1726882682.06049: done running TaskExecutor() for managed_node2/TASK: Install yum-utils package [0e448fcc-3ce9-313b-197e-0000000000eb] 25201 1726882682.06057: sending task result for task 0e448fcc-3ce9-313b-197e-0000000000eb 25201 1726882682.06160: done sending task result for task 0e448fcc-3ce9-313b-197e-0000000000eb 25201 1726882682.06171: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 25201 1726882682.06256: no more pending results, returning what we have 25201 1726882682.06259: results queue empty 25201 1726882682.06260: checking for any_errors_fatal 25201 1726882682.06268: done checking for any_errors_fatal 25201 1726882682.06269: checking for max_fail_percentage 25201 1726882682.06270: done checking for max_fail_percentage 25201 1726882682.06272: checking to see if all hosts have failed and the running result is not ok 25201 1726882682.06272: done checking to see if all hosts have failed 25201 1726882682.06273: getting the remaining hosts for this loop 25201 1726882682.06275: done getting the remaining hosts for this loop 25201 1726882682.06278: getting the next task for host managed_node2 25201 1726882682.06283: done getting next task for host managed_node2 25201 1726882682.06285: ^ task is: TASK: Enable EPEL 7 25201 1726882682.06288: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25201 1726882682.06291: getting variables 25201 1726882682.06292: in VariableManager get_vars() 25201 1726882682.06359: Calling all_inventory to load vars for managed_node2 25201 1726882682.06362: Calling groups_inventory to load vars for managed_node2 25201 1726882682.06368: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882682.06376: Calling all_plugins_play to load vars for managed_node2 25201 1726882682.06379: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882682.06382: Calling groups_plugins_play to load vars for managed_node2 25201 1726882682.06544: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882682.06747: done with get_vars() 25201 1726882682.06766: done getting variables 25201 1726882682.06822: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Enable EPEL 7] *********************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:32 Friday 20 September 2024 21:38:02 -0400 (0:00:00.022) 0:00:03.243 ****** 25201 1726882682.06850: entering _queue_task() for managed_node2/command 25201 1726882682.07063: worker is 1 (out of 1 available) 25201 1726882682.07078: exiting _queue_task() for managed_node2/command 25201 1726882682.07097: done queuing things up, now waiting for results queue to drain 25201 1726882682.07099: waiting for pending results... 25201 1726882682.07343: running TaskExecutor() for managed_node2/TASK: Enable EPEL 7 25201 1726882682.07450: in run() - task 0e448fcc-3ce9-313b-197e-0000000000ec 25201 1726882682.07469: variable 'ansible_search_path' from source: unknown 25201 1726882682.07477: variable 'ansible_search_path' from source: unknown 25201 1726882682.07512: calling self._execute() 25201 1726882682.07590: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882682.07601: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882682.07613: variable 'omit' from source: magic vars 25201 1726882682.07988: variable 'ansible_distribution' from source: facts 25201 1726882682.08004: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 25201 1726882682.08141: variable 'ansible_distribution_major_version' from source: facts 25201 1726882682.08151: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 25201 1726882682.08157: when evaluation is False, skipping this task 25201 1726882682.08165: _execute() done 25201 1726882682.08175: dumping result to json 25201 1726882682.08186: done dumping result, returning 25201 1726882682.08198: done running TaskExecutor() for managed_node2/TASK: Enable EPEL 7 [0e448fcc-3ce9-313b-197e-0000000000ec] 25201 1726882682.08207: sending task result for task 0e448fcc-3ce9-313b-197e-0000000000ec skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 25201 1726882682.08341: no more pending results, returning what we have 25201 1726882682.08344: results queue empty 25201 1726882682.08346: checking for any_errors_fatal 25201 1726882682.08350: done checking 
for any_errors_fatal 25201 1726882682.08351: checking for max_fail_percentage 25201 1726882682.08352: done checking for max_fail_percentage 25201 1726882682.08353: checking to see if all hosts have failed and the running result is not ok 25201 1726882682.08354: done checking to see if all hosts have failed 25201 1726882682.08354: getting the remaining hosts for this loop 25201 1726882682.08356: done getting the remaining hosts for this loop 25201 1726882682.08360: getting the next task for host managed_node2 25201 1726882682.08368: done getting next task for host managed_node2 25201 1726882682.08371: ^ task is: TASK: Enable EPEL 8 25201 1726882682.08375: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25201 1726882682.08379: getting variables 25201 1726882682.08381: in VariableManager get_vars() 25201 1726882682.08409: Calling all_inventory to load vars for managed_node2 25201 1726882682.08412: Calling groups_inventory to load vars for managed_node2 25201 1726882682.08416: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882682.08428: Calling all_plugins_play to load vars for managed_node2 25201 1726882682.08431: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882682.08434: Calling groups_plugins_play to load vars for managed_node2 25201 1726882682.08607: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882682.08823: done with get_vars() 25201 1726882682.08832: done getting variables 25201 1726882682.08909: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Enable EPEL 8] *********************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:37 Friday 20 September 2024 21:38:02 -0400 (0:00:00.020) 0:00:03.264 ****** 25201 1726882682.08940: entering _queue_task() for managed_node2/command 25201 1726882682.08957: done sending task result for task 0e448fcc-3ce9-313b-197e-0000000000ec 25201 1726882682.08965: WORKER PROCESS EXITING 25201 1726882682.09340: worker is 1 (out of 1 available) 25201 1726882682.09351: exiting _queue_task() for managed_node2/command 25201 1726882682.09361: done queuing things up, now waiting for results queue to drain 25201 1726882682.09362: waiting for pending results... 
25201 1726882682.09611: running TaskExecutor() for managed_node2/TASK: Enable EPEL 8 25201 1726882682.09726: in run() - task 0e448fcc-3ce9-313b-197e-0000000000ed 25201 1726882682.09746: variable 'ansible_search_path' from source: unknown 25201 1726882682.09752: variable 'ansible_search_path' from source: unknown 25201 1726882682.09801: calling self._execute() 25201 1726882682.09902: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882682.09932: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882682.09947: variable 'omit' from source: magic vars 25201 1726882682.10620: variable 'ansible_distribution' from source: facts 25201 1726882682.10645: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 25201 1726882682.10787: variable 'ansible_distribution_major_version' from source: facts 25201 1726882682.10800: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 25201 1726882682.10808: when evaluation is False, skipping this task 25201 1726882682.10816: _execute() done 25201 1726882682.10822: dumping result to json 25201 1726882682.10829: done dumping result, returning 25201 1726882682.10847: done running TaskExecutor() for managed_node2/TASK: Enable EPEL 8 [0e448fcc-3ce9-313b-197e-0000000000ed] 25201 1726882682.10870: sending task result for task 0e448fcc-3ce9-313b-197e-0000000000ed 25201 1726882682.10980: done sending task result for task 0e448fcc-3ce9-313b-197e-0000000000ed 25201 1726882682.10987: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 25201 1726882682.11049: no more pending results, returning what we have 25201 1726882682.11052: results queue empty 25201 1726882682.11053: checking for any_errors_fatal 25201 1726882682.11059: done checking for any_errors_fatal 25201 1726882682.11060: checking for max_fail_percentage 25201 1726882682.11061: done checking for max_fail_percentage 25201 1726882682.11062: checking to see if all hosts have failed and the running result is not ok 25201 1726882682.11065: done checking to see if all hosts have failed 25201 1726882682.11066: getting the remaining hosts for this loop 25201 1726882682.11067: done getting the remaining hosts for this loop 25201 1726882682.11072: getting the next task for host managed_node2 25201 1726882682.11083: done getting next task for host managed_node2 25201 1726882682.11085: ^ task is: TASK: Enable EPEL 6 25201 1726882682.11090: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25201 1726882682.11093: getting variables 25201 1726882682.11094: in VariableManager get_vars() 25201 1726882682.11162: Calling all_inventory to load vars for managed_node2 25201 1726882682.11167: Calling groups_inventory to load vars for managed_node2 25201 1726882682.11171: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882682.11183: Calling all_plugins_play to load vars for managed_node2 25201 1726882682.11186: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882682.11190: Calling groups_plugins_play to load vars for managed_node2 25201 1726882682.11391: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882682.11653: done with get_vars() 25201 1726882682.11662: done getting variables 25201 1726882682.11753: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Enable EPEL 6] *********************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:42 Friday 20 September 2024 21:38:02 -0400 (0:00:00.029) 0:00:03.293 ****** 25201 1726882682.11912: entering _queue_task() for managed_node2/copy 25201 1726882682.12359: worker is 1 (out of 1 available) 25201 1726882682.12372: exiting _queue_task() for managed_node2/copy 25201 1726882682.12424: done queuing things up, now waiting for results queue to drain 25201 1726882682.12426: waiting for pending results... 25201 1726882682.12784: running TaskExecutor() for managed_node2/TASK: Enable EPEL 6 25201 1726882682.12892: in run() - task 0e448fcc-3ce9-313b-197e-0000000000ef 25201 1726882682.12914: variable 'ansible_search_path' from source: unknown 25201 1726882682.12928: variable 'ansible_search_path' from source: unknown 25201 1726882682.12965: calling self._execute() 25201 1726882682.13044: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882682.13055: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882682.13070: variable 'omit' from source: magic vars 25201 1726882682.13449: variable 'ansible_distribution' from source: facts 25201 1726882682.13476: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 25201 1726882682.13585: variable 'ansible_distribution_major_version' from source: facts 25201 1726882682.13594: Evaluated conditional (ansible_distribution_major_version == '6'): False 25201 1726882682.13600: when evaluation is False, skipping this task 25201 1726882682.13606: _execute() done 25201 1726882682.13610: dumping result to json 25201 1726882682.13616: done dumping result, returning 25201 1726882682.13623: done running TaskExecutor() for managed_node2/TASK: Enable EPEL 6 [0e448fcc-3ce9-313b-197e-0000000000ef] 25201 1726882682.13632: sending task result for task 0e448fcc-3ce9-313b-197e-0000000000ef skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 25201 1726882682.13765: no more pending results, returning what we have 25201 1726882682.13769: results queue empty 25201 1726882682.13770: checking for any_errors_fatal 25201 1726882682.13774: done checking for any_errors_fatal 25201 
1726882682.13775: checking for max_fail_percentage 25201 1726882682.13776: done checking for max_fail_percentage 25201 1726882682.13777: checking to see if all hosts have failed and the running result is not ok 25201 1726882682.13778: done checking to see if all hosts have failed 25201 1726882682.13779: getting the remaining hosts for this loop 25201 1726882682.13780: done getting the remaining hosts for this loop 25201 1726882682.13784: getting the next task for host managed_node2 25201 1726882682.13793: done getting next task for host managed_node2 25201 1726882682.13795: ^ task is: TASK: Set network provider to 'nm' 25201 1726882682.13798: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25201 1726882682.13802: getting variables 25201 1726882682.13803: in VariableManager get_vars() 25201 1726882682.13830: Calling all_inventory to load vars for managed_node2 25201 1726882682.13833: Calling groups_inventory to load vars for managed_node2 25201 1726882682.13837: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882682.13848: Calling all_plugins_play to load vars for managed_node2 25201 1726882682.13851: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882682.13854: Calling groups_plugins_play to load vars for managed_node2 25201 1726882682.14043: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882682.14251: done with get_vars() 25201 1726882682.14260: done getting variables 25201 1726882682.14341: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set network provider to 'nm'] ******************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tests_ipv6_nm.yml:13 Friday 20 September 2024 21:38:02 -0400 (0:00:00.025) 0:00:03.318 ****** 25201 1726882682.14425: entering _queue_task() for managed_node2/set_fact 25201 1726882682.14437: done sending task result for task 0e448fcc-3ce9-313b-197e-0000000000ef 25201 1726882682.14440: WORKER PROCESS EXITING 25201 1726882682.14779: worker is 1 (out of 1 available) 25201 1726882682.14789: exiting _queue_task() for managed_node2/set_fact 25201 1726882682.14799: done queuing things up, now waiting for results queue to drain 25201 1726882682.14801: waiting for pending results... 
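With the EPEL helper tasks all skipped, control returns to the test playbook itself: the task queued above comes from tests_ipv6_nm.yml:13 and, as the result recorded below confirms, simply records the provider under test. A minimal sketch, assuming no parameters beyond the fact itself:

    - name: Set network provider to 'nm'
      set_fact:
        network_provider: nm
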
25201 1726882682.15805: running TaskExecutor() for managed_node2/TASK: Set network provider to 'nm' 25201 1726882682.15889: in run() - task 0e448fcc-3ce9-313b-197e-000000000007 25201 1726882682.15909: variable 'ansible_search_path' from source: unknown 25201 1726882682.15949: calling self._execute() 25201 1726882682.16089: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882682.16100: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882682.16117: variable 'omit' from source: magic vars 25201 1726882682.16218: variable 'omit' from source: magic vars 25201 1726882682.16251: variable 'omit' from source: magic vars 25201 1726882682.16297: variable 'omit' from source: magic vars 25201 1726882682.16345: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25201 1726882682.16393: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25201 1726882682.16416: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25201 1726882682.16437: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25201 1726882682.16454: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25201 1726882682.16501: variable 'inventory_hostname' from source: host vars for 'managed_node2' 25201 1726882682.16509: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882682.16516: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882682.16627: Set connection var ansible_shell_executable to /bin/sh 25201 1726882682.16640: Set connection var ansible_pipelining to False 25201 1726882682.16649: Set connection var ansible_connection to ssh 25201 1726882682.16662: Set connection var ansible_module_compression to ZIP_DEFLATED 25201 1726882682.16671: Set connection var ansible_shell_type to sh 25201 1726882682.16684: Set connection var ansible_timeout to 10 25201 1726882682.16715: variable 'ansible_shell_executable' from source: unknown 25201 1726882682.16723: variable 'ansible_connection' from source: unknown 25201 1726882682.16729: variable 'ansible_module_compression' from source: unknown 25201 1726882682.16736: variable 'ansible_shell_type' from source: unknown 25201 1726882682.16741: variable 'ansible_shell_executable' from source: unknown 25201 1726882682.16747: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882682.16754: variable 'ansible_pipelining' from source: unknown 25201 1726882682.16759: variable 'ansible_timeout' from source: unknown 25201 1726882682.16771: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882682.16917: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 25201 1726882682.16937: variable 'omit' from source: magic vars 25201 1726882682.16947: starting attempt loop 25201 1726882682.16954: running the handler 25201 1726882682.16971: handler run complete 25201 1726882682.16990: attempt loop complete, returning result 25201 1726882682.16997: _execute() done 25201 1726882682.17003: 
dumping result to json 25201 1726882682.17010: done dumping result, returning 25201 1726882682.17025: done running TaskExecutor() for managed_node2/TASK: Set network provider to 'nm' [0e448fcc-3ce9-313b-197e-000000000007] 25201 1726882682.17040: sending task result for task 0e448fcc-3ce9-313b-197e-000000000007 ok: [managed_node2] => { "ansible_facts": { "network_provider": "nm" }, "changed": false } 25201 1726882682.17175: no more pending results, returning what we have 25201 1726882682.17178: results queue empty 25201 1726882682.17179: checking for any_errors_fatal 25201 1726882682.17185: done checking for any_errors_fatal 25201 1726882682.17185: checking for max_fail_percentage 25201 1726882682.17187: done checking for max_fail_percentage 25201 1726882682.17187: checking to see if all hosts have failed and the running result is not ok 25201 1726882682.17188: done checking to see if all hosts have failed 25201 1726882682.17189: getting the remaining hosts for this loop 25201 1726882682.17190: done getting the remaining hosts for this loop 25201 1726882682.17194: getting the next task for host managed_node2 25201 1726882682.17200: done getting next task for host managed_node2 25201 1726882682.17201: ^ task is: TASK: meta (flush_handlers) 25201 1726882682.17202: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25201 1726882682.17206: getting variables 25201 1726882682.17207: in VariableManager get_vars() 25201 1726882682.17272: Calling all_inventory to load vars for managed_node2 25201 1726882682.17276: Calling groups_inventory to load vars for managed_node2 25201 1726882682.17279: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882682.17289: Calling all_plugins_play to load vars for managed_node2 25201 1726882682.17292: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882682.17295: Calling groups_plugins_play to load vars for managed_node2 25201 1726882682.17454: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882682.17678: done with get_vars() 25201 1726882682.17693: done getting variables 25201 1726882682.17783: in VariableManager get_vars() 25201 1726882682.17805: Calling all_inventory to load vars for managed_node2 25201 1726882682.17808: Calling groups_inventory to load vars for managed_node2 25201 1726882682.17811: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882682.17868: done sending task result for task 0e448fcc-3ce9-313b-197e-000000000007 25201 1726882682.17872: WORKER PROCESS EXITING 25201 1726882682.17882: Calling all_plugins_play to load vars for managed_node2 25201 1726882682.17885: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882682.17888: Calling groups_plugins_play to load vars for managed_node2 25201 1726882682.18189: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882682.18427: done with get_vars() 25201 1726882682.18517: done queuing things up, now waiting for results queue to drain 25201 1726882682.18519: results queue empty 25201 1726882682.18520: checking for any_errors_fatal 25201 1726882682.18522: done checking for any_errors_fatal 25201 1726882682.18523: checking for 
max_fail_percentage 25201 1726882682.18524: done checking for max_fail_percentage 25201 1726882682.18524: checking to see if all hosts have failed and the running result is not ok 25201 1726882682.18525: done checking to see if all hosts have failed 25201 1726882682.18526: getting the remaining hosts for this loop 25201 1726882682.18527: done getting the remaining hosts for this loop 25201 1726882682.18529: getting the next task for host managed_node2 25201 1726882682.18532: done getting next task for host managed_node2 25201 1726882682.18534: ^ task is: TASK: meta (flush_handlers) 25201 1726882682.18535: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25201 1726882682.18543: getting variables 25201 1726882682.18544: in VariableManager get_vars() 25201 1726882682.18552: Calling all_inventory to load vars for managed_node2 25201 1726882682.18554: Calling groups_inventory to load vars for managed_node2 25201 1726882682.18556: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882682.18560: Calling all_plugins_play to load vars for managed_node2 25201 1726882682.18563: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882682.18569: Calling groups_plugins_play to load vars for managed_node2 25201 1726882682.18738: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882682.18937: done with get_vars() 25201 1726882682.18944: done getting variables 25201 1726882682.18988: in VariableManager get_vars() 25201 1726882682.18995: Calling all_inventory to load vars for managed_node2 25201 1726882682.18997: Calling groups_inventory to load vars for managed_node2 25201 1726882682.19000: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882682.19004: Calling all_plugins_play to load vars for managed_node2 25201 1726882682.19006: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882682.19016: Calling groups_plugins_play to load vars for managed_node2 25201 1726882682.19159: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882682.19330: done with get_vars() 25201 1726882682.19350: done queuing things up, now waiting for results queue to drain 25201 1726882682.19352: results queue empty 25201 1726882682.19353: checking for any_errors_fatal 25201 1726882682.19354: done checking for any_errors_fatal 25201 1726882682.19355: checking for max_fail_percentage 25201 1726882682.19356: done checking for max_fail_percentage 25201 1726882682.19357: checking to see if all hosts have failed and the running result is not ok 25201 1726882682.19358: done checking to see if all hosts have failed 25201 1726882682.19358: getting the remaining hosts for this loop 25201 1726882682.19359: done getting the remaining hosts for this loop 25201 1726882682.19362: getting the next task for host managed_node2 25201 1726882682.19366: done getting next task for host managed_node2 25201 1726882682.19367: ^ task is: None 25201 1726882682.19368: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 25201 1726882682.19370: done queuing things up, now waiting for results queue to drain 25201 1726882682.19371: results queue empty 25201 1726882682.19371: checking for any_errors_fatal 25201 1726882682.19372: done checking for any_errors_fatal 25201 1726882682.19373: checking for max_fail_percentage 25201 1726882682.19374: done checking for max_fail_percentage 25201 1726882682.19375: checking to see if all hosts have failed and the running result is not ok 25201 1726882682.19375: done checking to see if all hosts have failed 25201 1726882682.19377: getting the next task for host managed_node2 25201 1726882682.19379: done getting next task for host managed_node2 25201 1726882682.19380: ^ task is: None 25201 1726882682.19381: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25201 1726882682.19425: in VariableManager get_vars() 25201 1726882682.19455: done with get_vars() 25201 1726882682.19461: in VariableManager get_vars() 25201 1726882682.19478: done with get_vars() 25201 1726882682.19483: variable 'omit' from source: magic vars 25201 1726882682.19512: in VariableManager get_vars() 25201 1726882682.19526: done with get_vars() 25201 1726882682.19547: variable 'omit' from source: magic vars PLAY [Play for testing IPv6 config] ******************************************** 25201 1726882682.19954: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 25201 1726882682.19981: getting the remaining hosts for this loop 25201 1726882682.19982: done getting the remaining hosts for this loop 25201 1726882682.19985: getting the next task for host managed_node2 25201 1726882682.19988: done getting next task for host managed_node2 25201 1726882682.19990: ^ task is: TASK: Gathering Facts 25201 1726882682.19992: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25201 1726882682.19994: getting variables 25201 1726882682.19995: in VariableManager get_vars() 25201 1726882682.20010: Calling all_inventory to load vars for managed_node2 25201 1726882682.20012: Calling groups_inventory to load vars for managed_node2 25201 1726882682.20014: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882682.20019: Calling all_plugins_play to load vars for managed_node2 25201 1726882682.20032: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882682.20035: Calling groups_plugins_play to load vars for managed_node2 25201 1726882682.20226: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882682.20442: done with get_vars() 25201 1726882682.20451: done getting variables 25201 1726882682.20499: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:3 Friday 20 September 2024 21:38:02 -0400 (0:00:00.060) 0:00:03.379 ****** 25201 1726882682.20521: entering _queue_task() for managed_node2/gather_facts 25201 1726882682.20745: worker is 1 (out of 1 available) 25201 1726882682.20756: exiting _queue_task() for managed_node2/gather_facts 25201 1726882682.20768: done queuing things up, now waiting for results queue to drain 25201 1726882682.20769: waiting for pending results... 
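The banners above mark the start of the next play, "Play for testing IPv6 config", whose implicit "Gathering Facts" task (playbooks/tests_ipv6.yml:3) drives the SSH activity traced below: a home-directory probe followed by creation of a remote temporary directory under /root/.ansible/tmp. A hedged sketch of the play header only, with hosts: all assumed; the real play goes on to define the IPv6 test tasks:

    - name: Play for testing IPv6 config
      hosts: all           # assumption; the trace only shows managed_node2 being targeted
      gather_facts: true   # produces the implicit "Gathering Facts" task
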
25201 1726882682.21010: running TaskExecutor() for managed_node2/TASK: Gathering Facts 25201 1726882682.21108: in run() - task 0e448fcc-3ce9-313b-197e-000000000115 25201 1726882682.21129: variable 'ansible_search_path' from source: unknown 25201 1726882682.21174: calling self._execute() 25201 1726882682.21262: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882682.21279: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882682.21294: variable 'omit' from source: magic vars 25201 1726882682.21669: variable 'ansible_distribution_major_version' from source: facts 25201 1726882682.21693: Evaluated conditional (ansible_distribution_major_version != '6'): True 25201 1726882682.21702: variable 'omit' from source: magic vars 25201 1726882682.21725: variable 'omit' from source: magic vars 25201 1726882682.21765: variable 'omit' from source: magic vars 25201 1726882682.21811: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25201 1726882682.21848: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25201 1726882682.21877: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25201 1726882682.21905: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25201 1726882682.21921: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25201 1726882682.21953: variable 'inventory_hostname' from source: host vars for 'managed_node2' 25201 1726882682.21961: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882682.21973: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882682.22071: Set connection var ansible_shell_executable to /bin/sh 25201 1726882682.22086: Set connection var ansible_pipelining to False 25201 1726882682.22095: Set connection var ansible_connection to ssh 25201 1726882682.22104: Set connection var ansible_module_compression to ZIP_DEFLATED 25201 1726882682.22117: Set connection var ansible_shell_type to sh 25201 1726882682.22130: Set connection var ansible_timeout to 10 25201 1726882682.22154: variable 'ansible_shell_executable' from source: unknown 25201 1726882682.22163: variable 'ansible_connection' from source: unknown 25201 1726882682.22172: variable 'ansible_module_compression' from source: unknown 25201 1726882682.22179: variable 'ansible_shell_type' from source: unknown 25201 1726882682.22188: variable 'ansible_shell_executable' from source: unknown 25201 1726882682.22194: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882682.22199: variable 'ansible_pipelining' from source: unknown 25201 1726882682.22204: variable 'ansible_timeout' from source: unknown 25201 1726882682.22210: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882682.22394: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 25201 1726882682.22414: variable 'omit' from source: magic vars 25201 1726882682.22422: starting attempt loop 25201 1726882682.22428: running the 
handler 25201 1726882682.22453: variable 'ansible_facts' from source: unknown 25201 1726882682.22479: _low_level_execute_command(): starting 25201 1726882682.22490: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 25201 1726882682.23316: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25201 1726882682.23337: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882682.23354: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882682.23382: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882682.23437: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882682.23451: stderr chunk (state=3): >>>debug2: match not found <<< 25201 1726882682.23470: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882682.23496: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25201 1726882682.23510: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 25201 1726882682.23523: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 25201 1726882682.23546: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882682.23563: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882682.23583: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882682.23597: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882682.23615: stderr chunk (state=3): >>>debug2: match found <<< 25201 1726882682.23632: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882682.23728: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882682.23761: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882682.23786: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882682.23940: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 25201 1726882682.26180: stdout chunk (state=3): >>>/root <<< 25201 1726882682.26317: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882682.26407: stderr chunk (state=3): >>><<< 25201 1726882682.26419: stdout chunk (state=3): >>><<< 25201 1726882682.26547: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 25201 1726882682.26551: _low_level_execute_command(): starting 25201 1726882682.26554: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882682.264506-25387-13198908038594 `" && echo ansible-tmp-1726882682.264506-25387-13198908038594="` echo /root/.ansible/tmp/ansible-tmp-1726882682.264506-25387-13198908038594 `" ) && sleep 0' 25201 1726882682.27178: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25201 1726882682.27193: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882682.27213: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882682.27229: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882682.27274: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882682.27287: stderr chunk (state=3): >>>debug2: match not found <<< 25201 1726882682.27306: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882682.27330: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25201 1726882682.27342: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 25201 1726882682.27353: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 25201 1726882682.27369: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882682.27383: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882682.27398: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882682.27409: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882682.27426: stderr chunk (state=3): >>>debug2: match found <<< 25201 1726882682.27439: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882682.27518: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882682.27546: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882682.27561: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882682.27697: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 25201 1726882682.30377: stdout chunk (state=3): >>>ansible-tmp-1726882682.264506-25387-13198908038594=/root/.ansible/tmp/ansible-tmp-1726882682.264506-25387-13198908038594 <<< 25201 1726882682.30536: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882682.30617: stderr chunk (state=3): >>><<< 25201 1726882682.30626: stdout chunk (state=3): >>><<< 25201 1726882682.30669: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726882682.264506-25387-13198908038594=/root/.ansible/tmp/ansible-tmp-1726882682.264506-25387-13198908038594 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 25201 1726882682.30975: variable 'ansible_module_compression' from source: unknown 25201 1726882682.30978: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-25201fmfeipqk/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 25201 1726882682.30980: variable 'ansible_facts' from source: unknown 25201 1726882682.30982: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882682.264506-25387-13198908038594/AnsiballZ_setup.py 25201 1726882682.31139: Sending initial data 25201 1726882682.31142: Sent initial data (152 bytes) 25201 1726882682.32094: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25201 1726882682.32103: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882682.32113: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882682.32127: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882682.32166: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882682.32176: stderr chunk (state=3): >>>debug2: match not found <<< 25201 1726882682.32187: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882682.32200: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25201 1726882682.32208: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 25201 1726882682.32214: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 25201 1726882682.32222: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882682.32231: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882682.32242: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882682.32250: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882682.32257: stderr chunk (state=3): >>>debug2: match found <<< 25201 1726882682.32274: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882682.32343: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882682.32372: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882682.32390: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882682.32524: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 25201 1726882682.35253: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 25201 1726882682.35352: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 25201 1726882682.35458: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-25201fmfeipqk/tmpg7s6caqd /root/.ansible/tmp/ansible-tmp-1726882682.264506-25387-13198908038594/AnsiballZ_setup.py <<< 25201 1726882682.35554: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 25201 1726882682.37925: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882682.38027: stderr chunk (state=3): >>><<< 25201 1726882682.38031: stdout chunk (state=3): >>><<< 25201 1726882682.38048: done transferring module to remote 25201 1726882682.38059: _low_level_execute_command(): starting 25201 1726882682.38067: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882682.264506-25387-13198908038594/ /root/.ansible/tmp/ansible-tmp-1726882682.264506-25387-13198908038594/AnsiballZ_setup.py && sleep 0' 25201 1726882682.38539: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882682.38544: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882682.38547: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882682.38549: stderr chunk (state=3): >>>debug2: match not found <<< 25201 1726882682.38560: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882682.38592: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25201 1726882682.38595: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882682.38598: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882682.38600: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882682.38674: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882682.38695: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882682.38711: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882682.38843: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 25201 1726882682.41394: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882682.41435: stderr chunk (state=3): >>><<< 25201 1726882682.41439: stdout chunk (state=3): >>><<< 25201 1726882682.41482: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 25201 1726882682.41486: _low_level_execute_command(): starting 25201 1726882682.41489: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882682.264506-25387-13198908038594/AnsiballZ_setup.py && sleep 0' 25201 1726882682.42101: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25201 1726882682.42113: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882682.42127: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882682.42146: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882682.42190: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882682.42204: stderr chunk (state=3): >>>debug2: match not found <<< 25201 1726882682.42227: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882682.42246: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25201 1726882682.42259: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 25201 1726882682.42274: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 25201 1726882682.42286: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882682.42299: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882682.42312: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882682.42331: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882682.42343: stderr chunk (state=3): >>>debug2: match found <<< 25201 1726882682.42355: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882682.42438: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882682.42459: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882682.42478: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882682.42620: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 25201 1726882683.10501: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBALEARW5ZJ51XTLSDuUsPojumVU0f1DmiQsXjMOap4QLlljOiysapjSUe6pZOyAdiI/KfARhDoOFvlC07kCLCcs7DDk8JxBZpsM0D55SdDlfwsB3FVgWNP+9by8G6kzbePHWdZyyWlAuavj4OAEwAjpWpP8/daus0ha4xywlVVoKjAAAAFQCbiW4bR+tgMvjrxC198dqI1mTbjQAAAIBzCzkJTtnGDKfOHq2dFI5cUEuaj1PgRot3wyaXENzUjZVnIFgXUmgKDCxO+EAtU6uAkBPQF4XNgiuaw5bavYpZxcJ4WIpM4ZDRoSkc7BBbJPRLZ45GfrHJwgqAmAZ3RSvVqeXE4WKQHLm43/eDHewgPqqqWe6QVuQH5SEe79yk3wAAAIEArG+AuupiAeoVJ9Lh36QMj4kRo5pTASh2eD5MqSOdy39UhsXbWBcj3JCIvNk/nwep/9neGyRZ5t5wT05dRX80vlgZJX65hrbepO+lqC3wlng+6GQ34D7TJKYnvEkR3neE0+06kx5R6IRWZf1YQV6fMQhx8AJ2JmvnLFicmYlkhQQ=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDND+RJCrYgIUzolo5fZ64Ey6cksefKDUWmGDjsqVTmuT3HrlDyUZOro4JAnUQBmiamXsJUFbrFdJAVpukD4yyowqCQLr0ZFuKNEzrt5CObrtWflOskKynO3kaoU0WhDkqIbwS2j/+NxBCxgDGqd/5Os3cOMv3eyjUElz6xoI4zsmGMfxVYmT+/SHBfoyxyqY8Hw2Ooq+H5L9OlYgV4hqu7kKPpM1THUJTjy47m6qvws5gztclLjPA1KIW2Dz6kKzUYspNJcoS2sK1xFvL7mBjpGAP7WhXVH2n5ySenQ24Z6mEj+tG2f11rjPpjCUjDzzciGCWiRDZWBLm/GGmQXJJ8zAYnw82yIUKqufLrr1wmcXICPMVj9pFjXSoBWe/yhX9E87w7YD5HWsUrgrLdSctdV4QYy+R5g9ERi7FjwbRsuZ04BihZs70+f/29hUzuc6MA87KVovGT0Uc7GVC7bx8NLt0bTBsbydlONVHVQuol/YEpQrQophDvmBfh+PgMDH8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOEITn1vyppR+Moe1UdR0WGPhUnQ/dwHNcNi0OYy21LkBQ5jsxOPLvZ+C2MbRYlz2afs4nYYIV8E0AuK6aRks3w=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIKEdFOHVk9tX1R+zEyLVdxS/U5QeeeFYWSnUmjpXlpt7", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_local": {}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system": "Linux", "ansible_kernel": "5.14.0-508.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 15:49:37 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "ip-10-31-11-158.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-11-158", "ansible_nodename": "ip-10-31-11-158.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", 
"ansible_userspace_architecture": "x86_64", "ansible_machine_id": "21e18164a0c64d0daed004bd8a1b67b7", "ansible_loadavg": {"1m": 0.43, "5m": 0.4, "15m": 0.23}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_is_chroot": false, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_apparmor": {"status": "disabled"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Frid<<< 25201 1726882683.10527: stdout chunk (state=3): >>>ay", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "38", "second": "02", "epoch": "1726882682", "epoch_int": "1726882682", "date": "2024-09-20", "time": "21:38:02", "iso8601_micro": "2024-09-21T01:38:02.818716Z", "iso8601": "2024-09-21T01:38:02Z", "iso8601_basic": "20240920T213802818716", "iso8601_basic_short": "20240920T213802", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_fibre_channel_wwn": [], "ansible_iscsi_iqn": "", "ansible_interfaces": ["lo", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off 
[fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:4f:68:7a:de:b1", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.11.158", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::104f:68ff:fe7a:deb1", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": 
"off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.11.158", "broadcast": "10.31.11.25<<< 25201 1726882683.10589: stdout chunk (state=3): >>>5", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:4f:68:7a:de:b1", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.11.158"], "ansible_all_ipv6_addresses": ["fe80::104f:68ff:fe7a:deb1"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.11.158", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::104f:68ff:fe7a:deb1"]}, "ansible_service_mgr": "systemd", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2772, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 760, "free": 2772}, "nocache": {"free": 3235, "used": 297}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2e6858-9a88-b36a-7765-70992ab591a7", "ansible_product_uuid": "ec2e6858-9a88-b36a-7765-70992ab591a7", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 621, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264238383104, "block_size": 4096, "block_total": 65519355, "block_available": 64511324, "block_used": 1008031, "inode_total": 131071472, "inode_available": 130998690, "inode_used": 72782, "uuid": 
"6c640f10-8261-4074-b9b8-2cdc3ddcc013"}], "ansible_env": {"SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.40.7 33528 10.31.11.158 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.40.7 33528 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/1", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_pkg_mgr": "dnf", "ansible_fips": false, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_lsb": {}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:ef4e1c39-6f50-438a-87e7-12fb70b80bde", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 25201 1726882683.12887: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. <<< 25201 1726882683.12916: stderr chunk (state=3): >>><<< 25201 1726882683.12920: stdout chunk (state=3): >>><<< 25201 1726882683.13274: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBALEARW5ZJ51XTLSDuUsPojumVU0f1DmiQsXjMOap4QLlljOiysapjSUe6pZOyAdiI/KfARhDoOFvlC07kCLCcs7DDk8JxBZpsM0D55SdDlfwsB3FVgWNP+9by8G6kzbePHWdZyyWlAuavj4OAEwAjpWpP8/daus0ha4xywlVVoKjAAAAFQCbiW4bR+tgMvjrxC198dqI1mTbjQAAAIBzCzkJTtnGDKfOHq2dFI5cUEuaj1PgRot3wyaXENzUjZVnIFgXUmgKDCxO+EAtU6uAkBPQF4XNgiuaw5bavYpZxcJ4WIpM4ZDRoSkc7BBbJPRLZ45GfrHJwgqAmAZ3RSvVqeXE4WKQHLm43/eDHewgPqqqWe6QVuQH5SEe79yk3wAAAIEArG+AuupiAeoVJ9Lh36QMj4kRo5pTASh2eD5MqSOdy39UhsXbWBcj3JCIvNk/nwep/9neGyRZ5t5wT05dRX80vlgZJX65hrbepO+lqC3wlng+6GQ34D7TJKYnvEkR3neE0+06kx5R6IRWZf1YQV6fMQhx8AJ2JmvnLFicmYlkhQQ=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDND+RJCrYgIUzolo5fZ64Ey6cksefKDUWmGDjsqVTmuT3HrlDyUZOro4JAnUQBmiamXsJUFbrFdJAVpukD4yyowqCQLr0ZFuKNEzrt5CObrtWflOskKynO3kaoU0WhDkqIbwS2j/+NxBCxgDGqd/5Os3cOMv3eyjUElz6xoI4zsmGMfxVYmT+/SHBfoyxyqY8Hw2Ooq+H5L9OlYgV4hqu7kKPpM1THUJTjy47m6qvws5gztclLjPA1KIW2Dz6kKzUYspNJcoS2sK1xFvL7mBjpGAP7WhXVH2n5ySenQ24Z6mEj+tG2f11rjPpjCUjDzzciGCWiRDZWBLm/GGmQXJJ8zAYnw82yIUKqufLrr1wmcXICPMVj9pFjXSoBWe/yhX9E87w7YD5HWsUrgrLdSctdV4QYy+R5g9ERi7FjwbRsuZ04BihZs70+f/29hUzuc6MA87KVovGT0Uc7GVC7bx8NLt0bTBsbydlONVHVQuol/YEpQrQophDvmBfh+PgMDH8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOEITn1vyppR+Moe1UdR0WGPhUnQ/dwHNcNi0OYy21LkBQ5jsxOPLvZ+C2MbRYlz2afs4nYYIV8E0AuK6aRks3w=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIKEdFOHVk9tX1R+zEyLVdxS/U5QeeeFYWSnUmjpXlpt7", 
"ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_local": {}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system": "Linux", "ansible_kernel": "5.14.0-508.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 15:49:37 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "ip-10-31-11-158.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-11-158", "ansible_nodename": "ip-10-31-11-158.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "21e18164a0c64d0daed004bd8a1b67b7", "ansible_loadavg": {"1m": 0.43, "5m": 0.4, "15m": 0.23}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_is_chroot": false, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_apparmor": {"status": "disabled"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "38", "second": "02", "epoch": "1726882682", "epoch_int": "1726882682", "date": "2024-09-20", "time": "21:38:02", "iso8601_micro": "2024-09-21T01:38:02.818716Z", "iso8601": "2024-09-21T01:38:02Z", "iso8601_basic": "20240920T213802818716", "iso8601_basic_short": "20240920T213802", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_fibre_channel_wwn": [], "ansible_iscsi_iqn": "", "ansible_interfaces": ["lo", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", 
"tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:4f:68:7a:de:b1", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.11.158", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::104f:68ff:fe7a:deb1", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": 
"off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.11.158", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:4f:68:7a:de:b1", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.11.158"], "ansible_all_ipv6_addresses": ["fe80::104f:68ff:fe7a:deb1"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.11.158", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::104f:68ff:fe7a:deb1"]}, "ansible_service_mgr": "systemd", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2772, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 760, "free": 2772}, "nocache": {"free": 3235, "used": 297}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2e6858-9a88-b36a-7765-70992ab591a7", "ansible_product_uuid": "ec2e6858-9a88-b36a-7765-70992ab591a7", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"], "labels": [], "masters": []}, "start": 
"2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 621, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264238383104, "block_size": 4096, "block_total": 65519355, "block_available": 64511324, "block_used": 1008031, "inode_total": 131071472, "inode_available": 130998690, "inode_used": 72782, "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013"}], "ansible_env": {"SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.40.7 33528 10.31.11.158 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.40.7 33528 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/1", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_pkg_mgr": "dnf", "ansible_fips": false, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_lsb": {}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:ef4e1c39-6f50-438a-87e7-12fb70b80bde", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 Shared 
connection to 10.31.11.158 closed. 25201 1726882683.13533: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882682.264506-25387-13198908038594/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 25201 1726882683.13589: _low_level_execute_command(): starting 25201 1726882683.13602: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882682.264506-25387-13198908038594/ > /dev/null 2>&1 && sleep 0' 25201 1726882683.14353: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25201 1726882683.14375: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882683.14391: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882683.14409: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882683.14449: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882683.14461: stderr chunk (state=3): >>>debug2: match not found <<< 25201 1726882683.14479: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882683.14495: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25201 1726882683.14506: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 25201 1726882683.14515: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 25201 1726882683.14525: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882683.14537: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882683.14550: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882683.14560: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882683.14575: stderr chunk (state=3): >>>debug2: match found <<< 25201 1726882683.14590: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882683.14666: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882683.14692: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882683.14707: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882683.14918: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 25201 1726882683.17384: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882683.17451: stderr chunk (state=3): >>><<< 25201 1726882683.17454: stdout chunk (state=3): >>><<< 25201 1726882683.17770: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 
Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 25201 1726882683.17774: handler run complete 25201 1726882683.17776: variable 'ansible_facts' from source: unknown 25201 1726882683.17778: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882683.18023: variable 'ansible_facts' from source: unknown 25201 1726882683.18219: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882683.18579: attempt loop complete, returning result 25201 1726882683.18646: _execute() done 25201 1726882683.18652: dumping result to json 25201 1726882683.18689: done dumping result, returning 25201 1726882683.18869: done running TaskExecutor() for managed_node2/TASK: Gathering Facts [0e448fcc-3ce9-313b-197e-000000000115] 25201 1726882683.18879: sending task result for task 0e448fcc-3ce9-313b-197e-000000000115 ok: [managed_node2] 25201 1726882683.19896: no more pending results, returning what we have 25201 1726882683.19899: results queue empty 25201 1726882683.19900: checking for any_errors_fatal 25201 1726882683.19901: done checking for any_errors_fatal 25201 1726882683.19902: checking for max_fail_percentage 25201 1726882683.19903: done checking for max_fail_percentage 25201 1726882683.19904: checking to see if all hosts have failed and the running result is not ok 25201 1726882683.19905: done checking to see if all hosts have failed 25201 1726882683.19906: getting the remaining hosts for this loop 25201 1726882683.19907: done getting the remaining hosts for this loop 25201 1726882683.19911: getting the next task for host managed_node2 25201 1726882683.19917: done getting next task for host managed_node2 25201 1726882683.19918: ^ task is: TASK: meta (flush_handlers) 25201 1726882683.19920: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25201 1726882683.19924: getting variables 25201 1726882683.19925: in VariableManager get_vars() 25201 1726882683.19956: Calling all_inventory to load vars for managed_node2 25201 1726882683.19959: Calling groups_inventory to load vars for managed_node2 25201 1726882683.19962: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882683.19973: Calling all_plugins_play to load vars for managed_node2 25201 1726882683.19976: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882683.19979: Calling groups_plugins_play to load vars for managed_node2 25201 1726882683.20126: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882683.20587: done with get_vars() 25201 1726882683.20596: done getting variables 25201 1726882683.20890: done sending task result for task 0e448fcc-3ce9-313b-197e-000000000115 25201 1726882683.20894: WORKER PROCESS EXITING 25201 1726882683.20936: in VariableManager get_vars() 25201 1726882683.20950: Calling all_inventory to load vars for managed_node2 25201 1726882683.20952: Calling groups_inventory to load vars for managed_node2 25201 1726882683.20954: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882683.20958: Calling all_plugins_play to load vars for managed_node2 25201 1726882683.20960: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882683.20969: Calling groups_plugins_play to load vars for managed_node2 25201 1726882683.21115: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882683.21420: done with get_vars() 25201 1726882683.21431: done queuing things up, now waiting for results queue to drain 25201 1726882683.21433: results queue empty 25201 1726882683.21434: checking for any_errors_fatal 25201 1726882683.21437: done checking for any_errors_fatal 25201 1726882683.21438: checking for max_fail_percentage 25201 1726882683.21438: done checking for max_fail_percentage 25201 1726882683.21439: checking to see if all hosts have failed and the running result is not ok 25201 1726882683.21440: done checking to see if all hosts have failed 25201 1726882683.21440: getting the remaining hosts for this loop 25201 1726882683.21441: done getting the remaining hosts for this loop 25201 1726882683.21443: getting the next task for host managed_node2 25201 1726882683.21447: done getting next task for host managed_node2 25201 1726882683.21449: ^ task is: TASK: Include the task 'show_interfaces.yml' 25201 1726882683.21453: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25201 1726882683.21455: getting variables 25201 1726882683.21456: in VariableManager get_vars() 25201 1726882683.21472: Calling all_inventory to load vars for managed_node2 25201 1726882683.21475: Calling groups_inventory to load vars for managed_node2 25201 1726882683.21477: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882683.21481: Calling all_plugins_play to load vars for managed_node2 25201 1726882683.21598: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882683.21602: Calling groups_plugins_play to load vars for managed_node2 25201 1726882683.21841: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882683.22234: done with get_vars() 25201 1726882683.22242: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:9 Friday 20 September 2024 21:38:03 -0400 (0:00:01.017) 0:00:04.397 ****** 25201 1726882683.22320: entering _queue_task() for managed_node2/include_tasks 25201 1726882683.22735: worker is 1 (out of 1 available) 25201 1726882683.22747: exiting _queue_task() for managed_node2/include_tasks 25201 1726882683.22760: done queuing things up, now waiting for results queue to drain 25201 1726882683.22762: waiting for pending results... 25201 1726882683.23683: running TaskExecutor() for managed_node2/TASK: Include the task 'show_interfaces.yml' 25201 1726882683.24104: in run() - task 0e448fcc-3ce9-313b-197e-00000000000b 25201 1726882683.24116: variable 'ansible_search_path' from source: unknown 25201 1726882683.24294: calling self._execute() 25201 1726882683.24607: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882683.24679: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882683.24699: variable 'omit' from source: magic vars 25201 1726882683.25750: variable 'ansible_distribution_major_version' from source: facts 25201 1726882683.25774: Evaluated conditional (ansible_distribution_major_version != '6'): True 25201 1726882683.25788: _execute() done 25201 1726882683.25795: dumping result to json 25201 1726882683.25802: done dumping result, returning 25201 1726882683.25810: done running TaskExecutor() for managed_node2/TASK: Include the task 'show_interfaces.yml' [0e448fcc-3ce9-313b-197e-00000000000b] 25201 1726882683.25818: sending task result for task 0e448fcc-3ce9-313b-197e-00000000000b 25201 1726882683.25939: no more pending results, returning what we have 25201 1726882683.25944: in VariableManager get_vars() 25201 1726882683.25994: Calling all_inventory to load vars for managed_node2 25201 1726882683.25997: Calling groups_inventory to load vars for managed_node2 25201 1726882683.25999: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882683.26012: Calling all_plugins_play to load vars for managed_node2 25201 1726882683.26015: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882683.26018: Calling groups_plugins_play to load vars for managed_node2 25201 1726882683.26236: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882683.26440: done with get_vars() 25201 1726882683.26447: variable 'ansible_search_path' from source: unknown 25201 1726882683.26458: done sending task result for task 0e448fcc-3ce9-313b-197e-00000000000b 
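At this point the play-level include has passed its conditional (ansible_distribution_major_version != '6') and handed show_interfaces.yml to the strategy for block generation. The playbook source itself is not printed in this log, so the following is only a minimal sketch of what the task at tests_ipv6.yml:9 plausibly looks like; the task name and included file come from the trace, and the when clause may equally be inherited from the play rather than written on the task itself.

# playbooks/tests_ipv6.yml, task at line 9 (sketch, not quoted from the file)
- name: Include the task 'show_interfaces.yml'
  include_tasks: tasks/show_interfaces.yml
  when: ansible_distribution_major_version != '6'
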
25201 1726882683.26461: WORKER PROCESS EXITING 25201 1726882683.26469: we have included files to process 25201 1726882683.26470: generating all_blocks data 25201 1726882683.26472: done generating all_blocks data 25201 1726882683.26473: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 25201 1726882683.26474: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 25201 1726882683.26477: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 25201 1726882683.26688: in VariableManager get_vars() 25201 1726882683.26714: done with get_vars() 25201 1726882683.26823: done processing included file 25201 1726882683.26825: iterating over new_blocks loaded from include file 25201 1726882683.26826: in VariableManager get_vars() 25201 1726882683.26843: done with get_vars() 25201 1726882683.26844: filtering new block on tags 25201 1726882683.26861: done filtering new block on tags 25201 1726882683.26865: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed_node2 25201 1726882683.26934: extending task lists for all hosts with included blocks 25201 1726882683.27001: done extending task lists 25201 1726882683.27002: done processing included files 25201 1726882683.27003: results queue empty 25201 1726882683.27004: checking for any_errors_fatal 25201 1726882683.27005: done checking for any_errors_fatal 25201 1726882683.27006: checking for max_fail_percentage 25201 1726882683.27007: done checking for max_fail_percentage 25201 1726882683.27008: checking to see if all hosts have failed and the running result is not ok 25201 1726882683.27008: done checking to see if all hosts have failed 25201 1726882683.27009: getting the remaining hosts for this loop 25201 1726882683.27010: done getting the remaining hosts for this loop 25201 1726882683.27012: getting the next task for host managed_node2 25201 1726882683.27016: done getting next task for host managed_node2 25201 1726882683.27018: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 25201 1726882683.27020: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25201 1726882683.27022: getting variables 25201 1726882683.27023: in VariableManager get_vars() 25201 1726882683.27084: Calling all_inventory to load vars for managed_node2 25201 1726882683.27087: Calling groups_inventory to load vars for managed_node2 25201 1726882683.27089: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882683.27094: Calling all_plugins_play to load vars for managed_node2 25201 1726882683.27096: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882683.27099: Calling groups_plugins_play to load vars for managed_node2 25201 1726882683.27239: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882683.27433: done with get_vars() 25201 1726882683.27442: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Friday 20 September 2024 21:38:03 -0400 (0:00:00.051) 0:00:04.449 ****** 25201 1726882683.27518: entering _queue_task() for managed_node2/include_tasks 25201 1726882683.27755: worker is 1 (out of 1 available) 25201 1726882683.27766: exiting _queue_task() for managed_node2/include_tasks 25201 1726882683.27777: done queuing things up, now waiting for results queue to drain 25201 1726882683.27779: waiting for pending results... 25201 1726882683.28036: running TaskExecutor() for managed_node2/TASK: Include the task 'get_current_interfaces.yml' 25201 1726882683.28134: in run() - task 0e448fcc-3ce9-313b-197e-00000000012b 25201 1726882683.28151: variable 'ansible_search_path' from source: unknown 25201 1726882683.28158: variable 'ansible_search_path' from source: unknown 25201 1726882683.28197: calling self._execute() 25201 1726882683.28278: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882683.28289: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882683.28301: variable 'omit' from source: magic vars 25201 1726882683.28715: variable 'ansible_distribution_major_version' from source: facts 25201 1726882683.28731: Evaluated conditional (ansible_distribution_major_version != '6'): True 25201 1726882683.28741: _execute() done 25201 1726882683.28747: dumping result to json 25201 1726882683.28755: done dumping result, returning 25201 1726882683.28768: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_current_interfaces.yml' [0e448fcc-3ce9-313b-197e-00000000012b] 25201 1726882683.28782: sending task result for task 0e448fcc-3ce9-313b-197e-00000000012b 25201 1726882683.28890: no more pending results, returning what we have 25201 1726882683.28895: in VariableManager get_vars() 25201 1726882683.28938: Calling all_inventory to load vars for managed_node2 25201 1726882683.28940: Calling groups_inventory to load vars for managed_node2 25201 1726882683.28943: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882683.28956: Calling all_plugins_play to load vars for managed_node2 25201 1726882683.28959: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882683.28962: Calling groups_plugins_play to load vars for managed_node2 25201 1726882683.29187: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882683.29399: done with get_vars() 25201 1726882683.29405: variable 'ansible_search_path' from source: 
unknown 25201 1726882683.29406: variable 'ansible_search_path' from source: unknown 25201 1726882683.29444: we have included files to process 25201 1726882683.29445: generating all_blocks data 25201 1726882683.29447: done generating all_blocks data 25201 1726882683.29448: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 25201 1726882683.29449: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 25201 1726882683.29451: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 25201 1726882683.29814: done sending task result for task 0e448fcc-3ce9-313b-197e-00000000012b 25201 1726882683.29817: WORKER PROCESS EXITING 25201 1726882683.29991: done processing included file 25201 1726882683.29993: iterating over new_blocks loaded from include file 25201 1726882683.29995: in VariableManager get_vars() 25201 1726882683.30011: done with get_vars() 25201 1726882683.30013: filtering new block on tags 25201 1726882683.30029: done filtering new block on tags 25201 1726882683.30031: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed_node2 25201 1726882683.30035: extending task lists for all hosts with included blocks 25201 1726882683.30141: done extending task lists 25201 1726882683.30142: done processing included files 25201 1726882683.30143: results queue empty 25201 1726882683.30143: checking for any_errors_fatal 25201 1726882683.30146: done checking for any_errors_fatal 25201 1726882683.30147: checking for max_fail_percentage 25201 1726882683.30148: done checking for max_fail_percentage 25201 1726882683.30149: checking to see if all hosts have failed and the running result is not ok 25201 1726882683.30150: done checking to see if all hosts have failed 25201 1726882683.30150: getting the remaining hosts for this loop 25201 1726882683.30151: done getting the remaining hosts for this loop 25201 1726882683.30158: getting the next task for host managed_node2 25201 1726882683.30167: done getting next task for host managed_node2 25201 1726882683.30169: ^ task is: TASK: Gather current interface info 25201 1726882683.30172: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25201 1726882683.30174: getting variables 25201 1726882683.30175: in VariableManager get_vars() 25201 1726882683.30186: Calling all_inventory to load vars for managed_node2 25201 1726882683.30189: Calling groups_inventory to load vars for managed_node2 25201 1726882683.30191: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882683.30195: Calling all_plugins_play to load vars for managed_node2 25201 1726882683.30197: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882683.30200: Calling groups_plugins_play to load vars for managed_node2 25201 1726882683.30360: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882683.30783: done with get_vars() 25201 1726882683.30792: done getting variables 25201 1726882683.30890: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Friday 20 September 2024 21:38:03 -0400 (0:00:00.033) 0:00:04.483 ****** 25201 1726882683.30916: entering _queue_task() for managed_node2/command 25201 1726882683.31473: worker is 1 (out of 1 available) 25201 1726882683.31485: exiting _queue_task() for managed_node2/command 25201 1726882683.31494: done queuing things up, now waiting for results queue to drain 25201 1726882683.31496: waiting for pending results... 
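The task paths printed above imply a two-level include: show_interfaces.yml line 3 pulls in get_current_interfaces.yml, whose task at line 3 ('Gather current interface info', handled by the command action plugin) has just been queued. Neither file's contents appear in this log; a plausible sketch of the outer include, with only the task name and included file taken from the trace:

# tasks/show_interfaces.yml, include at line 3 (sketch)
- name: Include the task 'get_current_interfaces.yml'
  include_tasks: get_current_interfaces.yml
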
25201 1726882683.31851: running TaskExecutor() for managed_node2/TASK: Gather current interface info 25201 1726882683.31952: in run() - task 0e448fcc-3ce9-313b-197e-00000000013a 25201 1726882683.31974: variable 'ansible_search_path' from source: unknown 25201 1726882683.31981: variable 'ansible_search_path' from source: unknown 25201 1726882683.32022: calling self._execute() 25201 1726882683.32103: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882683.32114: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882683.32131: variable 'omit' from source: magic vars 25201 1726882683.32503: variable 'ansible_distribution_major_version' from source: facts 25201 1726882683.32522: Evaluated conditional (ansible_distribution_major_version != '6'): True 25201 1726882683.32532: variable 'omit' from source: magic vars 25201 1726882683.32583: variable 'omit' from source: magic vars 25201 1726882683.32623: variable 'omit' from source: magic vars 25201 1726882683.32672: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25201 1726882683.32716: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25201 1726882683.32738: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25201 1726882683.32759: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25201 1726882683.32783: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25201 1726882683.32819: variable 'inventory_hostname' from source: host vars for 'managed_node2' 25201 1726882683.32828: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882683.32835: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882683.32943: Set connection var ansible_shell_executable to /bin/sh 25201 1726882683.32953: Set connection var ansible_pipelining to False 25201 1726882683.32967: Set connection var ansible_connection to ssh 25201 1726882683.32978: Set connection var ansible_module_compression to ZIP_DEFLATED 25201 1726882683.32984: Set connection var ansible_shell_type to sh 25201 1726882683.33000: Set connection var ansible_timeout to 10 25201 1726882683.33027: variable 'ansible_shell_executable' from source: unknown 25201 1726882683.33034: variable 'ansible_connection' from source: unknown 25201 1726882683.33041: variable 'ansible_module_compression' from source: unknown 25201 1726882683.33046: variable 'ansible_shell_type' from source: unknown 25201 1726882683.33052: variable 'ansible_shell_executable' from source: unknown 25201 1726882683.33058: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882683.33069: variable 'ansible_pipelining' from source: unknown 25201 1726882683.33077: variable 'ansible_timeout' from source: unknown 25201 1726882683.33084: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882683.33225: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 25201 1726882683.33244: variable 'omit' from source: magic vars 25201 
1726882683.33253: starting attempt loop 25201 1726882683.33258: running the handler 25201 1726882683.33280: _low_level_execute_command(): starting 25201 1726882683.33291: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 25201 1726882683.34141: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25201 1726882683.34159: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882683.34181: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882683.34204: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882683.34250: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882683.34267: stderr chunk (state=3): >>>debug2: match not found <<< 25201 1726882683.34283: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882683.34302: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25201 1726882683.34318: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 25201 1726882683.34331: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 25201 1726882683.34342: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882683.34354: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882683.34372: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882683.34384: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882683.34395: stderr chunk (state=3): >>>debug2: match found <<< 25201 1726882683.34408: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882683.34494: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882683.34516: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882683.34535: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882683.34683: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 25201 1726882683.36942: stdout chunk (state=3): >>>/root <<< 25201 1726882683.37153: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882683.37156: stdout chunk (state=3): >>><<< 25201 1726882683.37166: stderr chunk (state=3): >>><<< 25201 1726882683.37271: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 25201 1726882683.37275: _low_level_execute_command(): starting 25201 1726882683.37278: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882683.3718464-25418-32714153881527 `" && echo ansible-tmp-1726882683.3718464-25418-32714153881527="` echo /root/.ansible/tmp/ansible-tmp-1726882683.3718464-25418-32714153881527 `" ) && sleep 0' 25201 1726882683.37873: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25201 1726882683.37888: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882683.37908: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882683.37929: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882683.37978: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882683.37990: stderr chunk (state=3): >>>debug2: match not found <<< 25201 1726882683.38002: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882683.38018: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25201 1726882683.38028: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 25201 1726882683.38038: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 25201 1726882683.38048: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882683.38070: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882683.38086: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882683.38096: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882683.38106: stderr chunk (state=3): >>>debug2: match found <<< 25201 1726882683.38118: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882683.38200: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882683.38223: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882683.38239: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882683.38377: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 25201 1726882683.41098: stdout chunk (state=3): >>>ansible-tmp-1726882683.3718464-25418-32714153881527=/root/.ansible/tmp/ansible-tmp-1726882683.3718464-25418-32714153881527 <<< 25201 1726882683.41279: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882683.41349: stderr chunk (state=3): >>><<< 25201 1726882683.41353: stdout chunk (state=3): >>><<< 25201 1726882683.41470: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726882683.3718464-25418-32714153881527=/root/.ansible/tmp/ansible-tmp-1726882683.3718464-25418-32714153881527 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 25201 1726882683.41474: variable 'ansible_module_compression' from source: unknown 25201 1726882683.41477: ANSIBALLZ: Using generic lock for ansible.legacy.command 25201 1726882683.41479: ANSIBALLZ: Acquiring lock 25201 1726882683.41481: ANSIBALLZ: Lock acquired: 140300039193808 25201 1726882683.41483: ANSIBALLZ: Creating module 25201 1726882683.55777: ANSIBALLZ: Writing module into payload 25201 1726882683.55892: ANSIBALLZ: Writing module 25201 1726882683.55917: ANSIBALLZ: Renaming module 25201 1726882683.55928: ANSIBALLZ: Done creating module 25201 1726882683.55949: variable 'ansible_facts' from source: unknown 25201 1726882683.56028: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882683.3718464-25418-32714153881527/AnsiballZ_command.py 25201 1726882683.56182: Sending initial data 25201 1726882683.56185: Sent initial data (155 bytes) 25201 1726882683.57146: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25201 1726882683.57161: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882683.57179: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882683.57200: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882683.57244: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882683.57257: stderr chunk (state=3): >>>debug2: match not found <<< 25201 1726882683.57279: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882683.57299: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25201 1726882683.57311: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 25201 1726882683.57323: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 25201 1726882683.57336: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882683.57349: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882683.57370: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882683.57384: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882683.57396: stderr chunk (state=3): >>>debug2: match found <<< 25201 1726882683.57410: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882683.57487: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882683.57511: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882683.57529: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882683.57675: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 25201 1726882683.60141: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 25201 1726882683.60241: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 25201 1726882683.60352: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-25201fmfeipqk/tmpudl46wxs /root/.ansible/tmp/ansible-tmp-1726882683.3718464-25418-32714153881527/AnsiballZ_command.py <<< 25201 1726882683.60458: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 25201 1726882683.61948: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882683.62147: stderr chunk (state=3): >>><<< 25201 1726882683.62151: stdout chunk (state=3): >>><<< 25201 1726882683.62153: done transferring module to remote 25201 1726882683.62156: _low_level_execute_command(): starting 25201 1726882683.62158: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882683.3718464-25418-32714153881527/ /root/.ansible/tmp/ansible-tmp-1726882683.3718464-25418-32714153881527/AnsiballZ_command.py && sleep 0' 25201 1726882683.62847: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25201 1726882683.62877: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882683.62898: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882683.62933: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882683.63013: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882683.63047: stderr chunk (state=3): >>>debug2: match not found <<< 25201 1726882683.63060: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882683.63081: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25201 1726882683.63094: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 25201 
1726882683.63105: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 25201 1726882683.63126: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882683.63168: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882683.63172: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882683.63222: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882683.63228: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882683.63338: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 25201 1726882683.65741: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882683.65801: stderr chunk (state=3): >>><<< 25201 1726882683.65808: stdout chunk (state=3): >>><<< 25201 1726882683.65839: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 25201 1726882683.65842: _low_level_execute_command(): starting 25201 1726882683.65845: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882683.3718464-25418-32714153881527/AnsiballZ_command.py && sleep 0' 25201 1726882683.66270: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882683.66274: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882683.66310: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882683.66313: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882683.66315: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882683.66365: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882683.66369: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882683.66495: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 25201 1726882683.85167: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 21:38:03.845112", "end": "2024-09-20 21:38:03.849102", "delta": "0:00:00.003990", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 25201 1726882683.86786: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. <<< 25201 1726882683.86839: stderr chunk (state=3): >>><<< 25201 1726882683.86842: stdout chunk (state=3): >>><<< 25201 1726882683.86973: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 21:38:03.845112", "end": "2024-09-20 21:38:03.849102", "delta": "0:00:00.003990", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
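Every _low_level_execute_command() in this task reuses an already-established SSH connection instead of opening a new one: the stderr chunks show 'auto-mux: Trying existing master' followed by mux_client_hello_exchange and mux_client_request_session, i.e. OpenSSH ControlMaster multiplexing. The ssh options actually in force are not printed here; the sketch below shows one illustrative way to request the same behavior through an inventory variable, where the ControlPersist and ControlPath values are assumptions and not taken from this run.

# group_vars/all.yml (illustrative only; the real ssh options are not visible in this log)
ansible_ssh_common_args: >-
  -o ControlMaster=auto
  -o ControlPersist=60s
  -o ControlPath=~/.ansible/cp/%C

With a persisted master socket, the mkdir, chmod, module execution and cleanup round trips above each cost one multiplexed channel rather than a full SSH handshake.
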
25201 1726882683.86980: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882683.3718464-25418-32714153881527/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 25201 1726882683.86983: _low_level_execute_command(): starting 25201 1726882683.86985: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882683.3718464-25418-32714153881527/ > /dev/null 2>&1 && sleep 0' 25201 1726882683.87620: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25201 1726882683.87636: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882683.87657: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882683.87680: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882683.87720: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882683.87736: stderr chunk (state=3): >>>debug2: match not found <<< 25201 1726882683.87760: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882683.87784: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25201 1726882683.87798: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 25201 1726882683.87810: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 25201 1726882683.87823: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882683.87837: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882683.87857: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882683.87882: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882683.87893: stderr chunk (state=3): >>>debug2: match found <<< 25201 1726882683.87905: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882683.88002: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882683.88004: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882683.88094: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 25201 1726882683.90647: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882683.90695: stderr chunk (state=3): >>><<< 25201 1726882683.90698: stdout chunk (state=3): >>><<< 25201 1726882683.90714: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 25201 1726882683.90742: handler run complete 25201 1726882683.90752: Evaluated conditional (False): False 25201 1726882683.90762: attempt loop complete, returning result 25201 1726882683.90767: _execute() done 25201 1726882683.90770: dumping result to json 25201 1726882683.90779: done dumping result, returning 25201 1726882683.90787: done running TaskExecutor() for managed_node2/TASK: Gather current interface info [0e448fcc-3ce9-313b-197e-00000000013a] 25201 1726882683.90790: sending task result for task 0e448fcc-3ce9-313b-197e-00000000013a 25201 1726882683.90889: done sending task result for task 0e448fcc-3ce9-313b-197e-00000000013a 25201 1726882683.90892: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003990", "end": "2024-09-20 21:38:03.849102", "rc": 0, "start": "2024-09-20 21:38:03.845112" } STDOUT: bonding_masters eth0 lo 25201 1726882683.90976: no more pending results, returning what we have 25201 1726882683.90979: results queue empty 25201 1726882683.90980: checking for any_errors_fatal 25201 1726882683.90981: done checking for any_errors_fatal 25201 1726882683.90982: checking for max_fail_percentage 25201 1726882683.90984: done checking for max_fail_percentage 25201 1726882683.90984: checking to see if all hosts have failed and the running result is not ok 25201 1726882683.90985: done checking to see if all hosts have failed 25201 1726882683.90986: getting the remaining hosts for this loop 25201 1726882683.90987: done getting the remaining hosts for this loop 25201 1726882683.90993: getting the next task for host managed_node2 25201 1726882683.90999: done getting next task for host managed_node2 25201 1726882683.91003: ^ task is: TASK: Set current_interfaces 25201 1726882683.91006: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25201 1726882683.91010: getting variables 25201 1726882683.91011: in VariableManager get_vars() 25201 1726882683.91049: Calling all_inventory to load vars for managed_node2 25201 1726882683.91052: Calling groups_inventory to load vars for managed_node2 25201 1726882683.91054: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882683.91066: Calling all_plugins_play to load vars for managed_node2 25201 1726882683.91069: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882683.91073: Calling groups_plugins_play to load vars for managed_node2 25201 1726882683.91234: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882683.91443: done with get_vars() 25201 1726882683.91453: done getting variables 25201 1726882683.91517: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Friday 20 September 2024 21:38:03 -0400 (0:00:00.606) 0:00:05.090 ****** 25201 1726882683.91544: entering _queue_task() for managed_node2/set_fact 25201 1726882683.91797: worker is 1 (out of 1 available) 25201 1726882683.91809: exiting _queue_task() for managed_node2/set_fact 25201 1726882683.91822: done queuing things up, now waiting for results queue to drain 25201 1726882683.91824: waiting for pending results... 
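The module invocation dumped above ({'chdir': '/sys/class/net', '_raw_params': 'ls -1', ...}), the reported changed: false following 'Evaluated conditional (False): False', and the _current_interfaces variable consumed by the 'Set current_interfaces' task that runs next together pin down what 'Gather current interface info' (get_current_interfaces.yml:3) does. A minimal sketch consistent with that evidence; the register name and the changed_when line are inferred from the trace, not quoted from the file:

# tasks/get_current_interfaces.yml, task at line 3 (sketch)
- name: Gather current interface info
  command: ls -1
  args:
    chdir: /sys/class/net
  register: _current_interfaces
  changed_when: false
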
25201 1726882683.92072: running TaskExecutor() for managed_node2/TASK: Set current_interfaces 25201 1726882683.92139: in run() - task 0e448fcc-3ce9-313b-197e-00000000013b 25201 1726882683.92159: variable 'ansible_search_path' from source: unknown 25201 1726882683.92166: variable 'ansible_search_path' from source: unknown 25201 1726882683.92193: calling self._execute() 25201 1726882683.92269: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882683.92276: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882683.92283: variable 'omit' from source: magic vars 25201 1726882683.92647: variable 'ansible_distribution_major_version' from source: facts 25201 1726882683.92675: Evaluated conditional (ansible_distribution_major_version != '6'): True 25201 1726882683.92688: variable 'omit' from source: magic vars 25201 1726882683.92753: variable 'omit' from source: magic vars 25201 1726882683.92879: variable '_current_interfaces' from source: set_fact 25201 1726882683.92955: variable 'omit' from source: magic vars 25201 1726882683.93002: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25201 1726882683.93118: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25201 1726882683.93155: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25201 1726882683.93183: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25201 1726882683.93200: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25201 1726882683.93238: variable 'inventory_hostname' from source: host vars for 'managed_node2' 25201 1726882683.93256: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882683.93269: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882683.93372: Set connection var ansible_shell_executable to /bin/sh 25201 1726882683.93397: Set connection var ansible_pipelining to False 25201 1726882683.93442: Set connection var ansible_connection to ssh 25201 1726882683.93445: Set connection var ansible_module_compression to ZIP_DEFLATED 25201 1726882683.93448: Set connection var ansible_shell_type to sh 25201 1726882683.93465: Set connection var ansible_timeout to 10 25201 1726882683.93502: variable 'ansible_shell_executable' from source: unknown 25201 1726882683.93520: variable 'ansible_connection' from source: unknown 25201 1726882683.93534: variable 'ansible_module_compression' from source: unknown 25201 1726882683.93549: variable 'ansible_shell_type' from source: unknown 25201 1726882683.93570: variable 'ansible_shell_executable' from source: unknown 25201 1726882683.93579: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882683.93590: variable 'ansible_pipelining' from source: unknown 25201 1726882683.93598: variable 'ansible_timeout' from source: unknown 25201 1726882683.93607: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882683.93741: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, 
class_only=False) 25201 1726882683.93749: variable 'omit' from source: magic vars 25201 1726882683.93753: starting attempt loop 25201 1726882683.93756: running the handler 25201 1726882683.93768: handler run complete 25201 1726882683.93776: attempt loop complete, returning result 25201 1726882683.93778: _execute() done 25201 1726882683.93781: dumping result to json 25201 1726882683.93783: done dumping result, returning 25201 1726882683.93797: done running TaskExecutor() for managed_node2/TASK: Set current_interfaces [0e448fcc-3ce9-313b-197e-00000000013b] 25201 1726882683.93799: sending task result for task 0e448fcc-3ce9-313b-197e-00000000013b 25201 1726882683.93877: done sending task result for task 0e448fcc-3ce9-313b-197e-00000000013b 25201 1726882683.93879: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo" ] }, "changed": false } 25201 1726882683.93951: no more pending results, returning what we have 25201 1726882683.93953: results queue empty 25201 1726882683.93954: checking for any_errors_fatal 25201 1726882683.93961: done checking for any_errors_fatal 25201 1726882683.93962: checking for max_fail_percentage 25201 1726882683.93965: done checking for max_fail_percentage 25201 1726882683.93966: checking to see if all hosts have failed and the running result is not ok 25201 1726882683.93967: done checking to see if all hosts have failed 25201 1726882683.93967: getting the remaining hosts for this loop 25201 1726882683.93968: done getting the remaining hosts for this loop 25201 1726882683.93972: getting the next task for host managed_node2 25201 1726882683.93978: done getting next task for host managed_node2 25201 1726882683.93980: ^ task is: TASK: Show current_interfaces 25201 1726882683.93982: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25201 1726882683.93985: getting variables 25201 1726882683.93986: in VariableManager get_vars() 25201 1726882683.94019: Calling all_inventory to load vars for managed_node2 25201 1726882683.94022: Calling groups_inventory to load vars for managed_node2 25201 1726882683.94024: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882683.94031: Calling all_plugins_play to load vars for managed_node2 25201 1726882683.94032: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882683.94034: Calling groups_plugins_play to load vars for managed_node2 25201 1726882683.94168: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882683.94281: done with get_vars() 25201 1726882683.94287: done getting variables 25201 1726882683.94351: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Friday 20 September 2024 21:38:03 -0400 (0:00:00.028) 0:00:05.118 ****** 25201 1726882683.94372: entering _queue_task() for managed_node2/debug 25201 1726882683.94373: Creating lock for debug 25201 1726882683.94543: worker is 1 (out of 1 available) 25201 1726882683.94555: exiting _queue_task() for managed_node2/debug 25201 1726882683.94566: done queuing things up, now waiting for results queue to drain 25201 1726882683.94568: waiting for pending results... 
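The set_fact handler above ran entirely on the controller (no _low_level_execute_command round trip) and produced current_interfaces == ['bonding_masters', 'eth0', 'lo'], i.e. the stdout lines of the earlier ls -1. Only the resulting fact is shown in the log; a sketch of the task at get_current_interfaces.yml:9, where the stdout_lines expression is an assumption:

# tasks/get_current_interfaces.yml, task at line 9 (sketch)
- name: Set current_interfaces
  set_fact:
    current_interfaces: "{{ _current_interfaces.stdout_lines }}"
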
25201 1726882683.94712: running TaskExecutor() for managed_node2/TASK: Show current_interfaces 25201 1726882683.94766: in run() - task 0e448fcc-3ce9-313b-197e-00000000012c 25201 1726882683.94777: variable 'ansible_search_path' from source: unknown 25201 1726882683.94780: variable 'ansible_search_path' from source: unknown 25201 1726882683.94812: calling self._execute() 25201 1726882683.94871: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882683.94876: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882683.94885: variable 'omit' from source: magic vars 25201 1726882683.95169: variable 'ansible_distribution_major_version' from source: facts 25201 1726882683.95186: Evaluated conditional (ansible_distribution_major_version != '6'): True 25201 1726882683.95195: variable 'omit' from source: magic vars 25201 1726882683.95232: variable 'omit' from source: magic vars 25201 1726882683.95331: variable 'current_interfaces' from source: set_fact 25201 1726882683.95359: variable 'omit' from source: magic vars 25201 1726882683.95401: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25201 1726882683.95439: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25201 1726882683.95461: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25201 1726882683.95489: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25201 1726882683.95505: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25201 1726882683.95536: variable 'inventory_hostname' from source: host vars for 'managed_node2' 25201 1726882683.95544: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882683.95552: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882683.95650: Set connection var ansible_shell_executable to /bin/sh 25201 1726882683.95660: Set connection var ansible_pipelining to False 25201 1726882683.95673: Set connection var ansible_connection to ssh 25201 1726882683.95682: Set connection var ansible_module_compression to ZIP_DEFLATED 25201 1726882683.95688: Set connection var ansible_shell_type to sh 25201 1726882683.95699: Set connection var ansible_timeout to 10 25201 1726882683.95721: variable 'ansible_shell_executable' from source: unknown 25201 1726882683.95728: variable 'ansible_connection' from source: unknown 25201 1726882683.95734: variable 'ansible_module_compression' from source: unknown 25201 1726882683.95739: variable 'ansible_shell_type' from source: unknown 25201 1726882683.95745: variable 'ansible_shell_executable' from source: unknown 25201 1726882683.95750: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882683.95757: variable 'ansible_pipelining' from source: unknown 25201 1726882683.95767: variable 'ansible_timeout' from source: unknown 25201 1726882683.95775: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882683.95996: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 
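As with the earlier command and set_fact tasks, the executor prints the effective connection settings before running the handler: ansible_connection=ssh, shell type sh with /bin/sh as executable, pipelining False, timeout 10, ZIP_DEFLATED module compression. The log does not say where these values come from (built-in defaults, ansible.cfg, or inventory); the sketch below only shows how the same values could be pinned explicitly as host variables and is not taken from this environment.

# host_vars/managed_node2.yml (illustrative; values mirror what the executor printed)
ansible_connection: ssh
ansible_shell_type: sh
ansible_shell_executable: /bin/sh
ansible_pipelining: false
ansible_timeout: 10
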
25201 1726882683.96010: variable 'omit' from source: magic vars 25201 1726882683.96019: starting attempt loop 25201 1726882683.96025: running the handler 25201 1726882683.96074: handler run complete 25201 1726882683.96091: attempt loop complete, returning result 25201 1726882683.96096: _execute() done 25201 1726882683.96102: dumping result to json 25201 1726882683.96108: done dumping result, returning 25201 1726882683.96117: done running TaskExecutor() for managed_node2/TASK: Show current_interfaces [0e448fcc-3ce9-313b-197e-00000000012c] 25201 1726882683.96128: sending task result for task 0e448fcc-3ce9-313b-197e-00000000012c 25201 1726882683.96207: done sending task result for task 0e448fcc-3ce9-313b-197e-00000000012c 25201 1726882683.96211: WORKER PROCESS EXITING ok: [managed_node2] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo'] 25201 1726882683.96378: no more pending results, returning what we have 25201 1726882683.96381: results queue empty 25201 1726882683.96382: checking for any_errors_fatal 25201 1726882683.96386: done checking for any_errors_fatal 25201 1726882683.96387: checking for max_fail_percentage 25201 1726882683.96389: done checking for max_fail_percentage 25201 1726882683.96390: checking to see if all hosts have failed and the running result is not ok 25201 1726882683.96391: done checking to see if all hosts have failed 25201 1726882683.96391: getting the remaining hosts for this loop 25201 1726882683.96393: done getting the remaining hosts for this loop 25201 1726882683.96396: getting the next task for host managed_node2 25201 1726882683.96403: done getting next task for host managed_node2 25201 1726882683.96405: ^ task is: TASK: Include the task 'manage_test_interface.yml' 25201 1726882683.96407: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25201 1726882683.96410: getting variables 25201 1726882683.96412: in VariableManager get_vars() 25201 1726882683.96452: Calling all_inventory to load vars for managed_node2 25201 1726882683.96456: Calling groups_inventory to load vars for managed_node2 25201 1726882683.96458: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882683.96468: Calling all_plugins_play to load vars for managed_node2 25201 1726882683.96471: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882683.96474: Calling groups_plugins_play to load vars for managed_node2 25201 1726882683.96642: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882683.97091: done with get_vars() 25201 1726882683.97106: done getting variables TASK [Include the task 'manage_test_interface.yml'] **************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:11 Friday 20 September 2024 21:38:03 -0400 (0:00:00.028) 0:00:05.146 ****** 25201 1726882683.97193: entering _queue_task() for managed_node2/include_tasks 25201 1726882683.97412: worker is 1 (out of 1 available) 25201 1726882683.97422: exiting _queue_task() for managed_node2/include_tasks 25201 1726882683.97442: done queuing things up, now waiting for results queue to drain 25201 1726882683.97444: waiting for pending results... 
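The include being queued here sits at tests_ipv6.yml:11 and, as the next block shows, resolves to the collection's tasks/manage_test_interface.yml. A hedged sketch of what that include likely looks like; the relative path is an assumption inferred from the resolved path in the trace.

```yaml
# Sketch of the include at tests_ipv6.yml:11; the relative path is inferred from the
# resolved .../playbooks/tasks/manage_test_interface.yml shown in the trace.
- name: Include the task 'manage_test_interface.yml'
  include_tasks: tasks/manage_test_interface.yml
```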
25201 1726882683.97758: running TaskExecutor() for managed_node2/TASK: Include the task 'manage_test_interface.yml' 25201 1726882683.97862: in run() - task 0e448fcc-3ce9-313b-197e-00000000000c 25201 1726882683.97889: variable 'ansible_search_path' from source: unknown 25201 1726882683.97933: calling self._execute() 25201 1726882683.98024: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882683.98035: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882683.98047: variable 'omit' from source: magic vars 25201 1726882683.98442: variable 'ansible_distribution_major_version' from source: facts 25201 1726882683.98464: Evaluated conditional (ansible_distribution_major_version != '6'): True 25201 1726882683.98476: _execute() done 25201 1726882683.98483: dumping result to json 25201 1726882683.98490: done dumping result, returning 25201 1726882683.98498: done running TaskExecutor() for managed_node2/TASK: Include the task 'manage_test_interface.yml' [0e448fcc-3ce9-313b-197e-00000000000c] 25201 1726882683.98508: sending task result for task 0e448fcc-3ce9-313b-197e-00000000000c 25201 1726882683.98639: done sending task result for task 0e448fcc-3ce9-313b-197e-00000000000c 25201 1726882683.98652: WORKER PROCESS EXITING 25201 1726882683.98691: no more pending results, returning what we have 25201 1726882683.98696: in VariableManager get_vars() 25201 1726882683.98744: Calling all_inventory to load vars for managed_node2 25201 1726882683.98747: Calling groups_inventory to load vars for managed_node2 25201 1726882683.98751: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882683.98766: Calling all_plugins_play to load vars for managed_node2 25201 1726882683.98769: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882683.98772: Calling groups_plugins_play to load vars for managed_node2 25201 1726882683.99012: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882683.99247: done with get_vars() 25201 1726882683.99265: variable 'ansible_search_path' from source: unknown 25201 1726882683.99310: we have included files to process 25201 1726882683.99317: generating all_blocks data 25201 1726882683.99320: done generating all_blocks data 25201 1726882683.99329: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 25201 1726882683.99331: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 25201 1726882683.99339: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 25201 1726882683.99850: in VariableManager get_vars() 25201 1726882683.99869: done with get_vars() 25201 1726882684.00023: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 25201 1726882684.00393: done processing included file 25201 1726882684.00395: iterating over new_blocks loaded from include file 25201 1726882684.00396: in VariableManager get_vars() 25201 1726882684.00408: done with get_vars() 25201 1726882684.00409: filtering new block on tags 25201 1726882684.00429: done filtering new block on tags 25201 1726882684.00430: done iterating over new_blocks loaded from include file included: 
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml for managed_node2 25201 1726882684.00433: extending task lists for all hosts with included blocks 25201 1726882684.00527: done extending task lists 25201 1726882684.00528: done processing included files 25201 1726882684.00529: results queue empty 25201 1726882684.00529: checking for any_errors_fatal 25201 1726882684.00532: done checking for any_errors_fatal 25201 1726882684.00532: checking for max_fail_percentage 25201 1726882684.00533: done checking for max_fail_percentage 25201 1726882684.00533: checking to see if all hosts have failed and the running result is not ok 25201 1726882684.00534: done checking to see if all hosts have failed 25201 1726882684.00534: getting the remaining hosts for this loop 25201 1726882684.00535: done getting the remaining hosts for this loop 25201 1726882684.00537: getting the next task for host managed_node2 25201 1726882684.00539: done getting next task for host managed_node2 25201 1726882684.00540: ^ task is: TASK: Ensure state in ["present", "absent"] 25201 1726882684.00542: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25201 1726882684.00543: getting variables 25201 1726882684.00544: in VariableManager get_vars() 25201 1726882684.00551: Calling all_inventory to load vars for managed_node2 25201 1726882684.00553: Calling groups_inventory to load vars for managed_node2 25201 1726882684.00554: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882684.00557: Calling all_plugins_play to load vars for managed_node2 25201 1726882684.00559: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882684.00560: Calling groups_plugins_play to load vars for managed_node2 25201 1726882684.00642: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882684.00758: done with get_vars() 25201 1726882684.00767: done getting variables 25201 1726882684.00806: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Ensure state in ["present", "absent"]] *********************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:3 Friday 20 September 2024 21:38:04 -0400 (0:00:00.036) 0:00:05.182 ****** 25201 1726882684.00824: entering _queue_task() for managed_node2/fail 25201 1726882684.00826: Creating lock for fail 25201 1726882684.01000: worker is 1 (out of 1 available) 25201 1726882684.01011: exiting _queue_task() for managed_node2/fail 25201 1726882684.01022: done queuing things up, now waiting for results queue to drain 25201 1726882684.01023: waiting for pending results... 
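The fail action loaded above backs the guard at manage_test_interface.yml:3; the skip result in the next block reports its condition as state not in ["present", "absent"]. A plausible reconstruction; the failure message is an assumption, since a skipped task never prints it.

```yaml
# Guard reconstructed from the false_condition in the skip result;
# the msg text is a guess and never appears in this trace.
- name: Ensure state in ["present", "absent"]
  fail:
    msg: "state must be one of: present, absent"
  when: state not in ["present", "absent"]
```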
25201 1726882684.01156: running TaskExecutor() for managed_node2/TASK: Ensure state in ["present", "absent"] 25201 1726882684.01217: in run() - task 0e448fcc-3ce9-313b-197e-000000000156 25201 1726882684.01226: variable 'ansible_search_path' from source: unknown 25201 1726882684.01230: variable 'ansible_search_path' from source: unknown 25201 1726882684.01255: calling self._execute() 25201 1726882684.01315: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882684.01318: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882684.01327: variable 'omit' from source: magic vars 25201 1726882684.01749: variable 'ansible_distribution_major_version' from source: facts 25201 1726882684.01770: Evaluated conditional (ansible_distribution_major_version != '6'): True 25201 1726882684.01921: variable 'state' from source: include params 25201 1726882684.01931: Evaluated conditional (state not in ["present", "absent"]): False 25201 1726882684.01939: when evaluation is False, skipping this task 25201 1726882684.01951: _execute() done 25201 1726882684.01965: dumping result to json 25201 1726882684.01974: done dumping result, returning 25201 1726882684.01982: done running TaskExecutor() for managed_node2/TASK: Ensure state in ["present", "absent"] [0e448fcc-3ce9-313b-197e-000000000156] 25201 1726882684.01991: sending task result for task 0e448fcc-3ce9-313b-197e-000000000156 skipping: [managed_node2] => { "changed": false, "false_condition": "state not in [\"present\", \"absent\"]", "skip_reason": "Conditional result was False" } 25201 1726882684.02127: no more pending results, returning what we have 25201 1726882684.02131: results queue empty 25201 1726882684.02132: checking for any_errors_fatal 25201 1726882684.02134: done checking for any_errors_fatal 25201 1726882684.02135: checking for max_fail_percentage 25201 1726882684.02136: done checking for max_fail_percentage 25201 1726882684.02137: checking to see if all hosts have failed and the running result is not ok 25201 1726882684.02138: done checking to see if all hosts have failed 25201 1726882684.02138: getting the remaining hosts for this loop 25201 1726882684.02140: done getting the remaining hosts for this loop 25201 1726882684.02144: getting the next task for host managed_node2 25201 1726882684.02150: done getting next task for host managed_node2 25201 1726882684.02153: ^ task is: TASK: Ensure type in ["dummy", "tap", "veth"] 25201 1726882684.02155: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25201 1726882684.02159: getting variables 25201 1726882684.02160: in VariableManager get_vars() 25201 1726882684.02240: Calling all_inventory to load vars for managed_node2 25201 1726882684.02243: Calling groups_inventory to load vars for managed_node2 25201 1726882684.02246: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882684.02258: Calling all_plugins_play to load vars for managed_node2 25201 1726882684.02261: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882684.02268: Calling groups_plugins_play to load vars for managed_node2 25201 1726882684.02460: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882684.02796: done with get_vars() 25201 1726882684.02805: done getting variables 25201 1726882684.02851: done sending task result for task 0e448fcc-3ce9-313b-197e-000000000156 25201 1726882684.02854: WORKER PROCESS EXITING 25201 1726882684.02960: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Ensure type in ["dummy", "tap", "veth"]] ********************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:8 Friday 20 September 2024 21:38:04 -0400 (0:00:00.021) 0:00:05.204 ****** 25201 1726882684.02989: entering _queue_task() for managed_node2/fail 25201 1726882684.03197: worker is 1 (out of 1 available) 25201 1726882684.03208: exiting _queue_task() for managed_node2/fail 25201 1726882684.03218: done queuing things up, now waiting for results queue to drain 25201 1726882684.03219: waiting for pending results... 
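The same pattern guards the interface type at manage_test_interface.yml:8; the skip result in the next block reports type not in ["dummy", "tap", "veth"] as the false condition. Reconstructed under the same assumptions as the state guard above.

```yaml
- name: Ensure type in ["dummy", "tap", "veth"]
  fail:
    msg: "type must be one of: dummy, tap, veth"   # msg text is an assumption
  when: type not in ["dummy", "tap", "veth"]
```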
25201 1726882684.03457: running TaskExecutor() for managed_node2/TASK: Ensure type in ["dummy", "tap", "veth"] 25201 1726882684.03567: in run() - task 0e448fcc-3ce9-313b-197e-000000000157 25201 1726882684.03585: variable 'ansible_search_path' from source: unknown 25201 1726882684.03592: variable 'ansible_search_path' from source: unknown 25201 1726882684.03635: calling self._execute() 25201 1726882684.03722: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882684.03738: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882684.03752: variable 'omit' from source: magic vars 25201 1726882684.04127: variable 'ansible_distribution_major_version' from source: facts 25201 1726882684.04145: Evaluated conditional (ansible_distribution_major_version != '6'): True 25201 1726882684.04329: variable 'type' from source: play vars 25201 1726882684.04339: Evaluated conditional (type not in ["dummy", "tap", "veth"]): False 25201 1726882684.04346: when evaluation is False, skipping this task 25201 1726882684.04353: _execute() done 25201 1726882684.04359: dumping result to json 25201 1726882684.04373: done dumping result, returning 25201 1726882684.04389: done running TaskExecutor() for managed_node2/TASK: Ensure type in ["dummy", "tap", "veth"] [0e448fcc-3ce9-313b-197e-000000000157] 25201 1726882684.04398: sending task result for task 0e448fcc-3ce9-313b-197e-000000000157 25201 1726882684.04504: done sending task result for task 0e448fcc-3ce9-313b-197e-000000000157 25201 1726882684.04511: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "type not in [\"dummy\", \"tap\", \"veth\"]", "skip_reason": "Conditional result was False" } 25201 1726882684.04572: no more pending results, returning what we have 25201 1726882684.04576: results queue empty 25201 1726882684.04577: checking for any_errors_fatal 25201 1726882684.04582: done checking for any_errors_fatal 25201 1726882684.04583: checking for max_fail_percentage 25201 1726882684.04585: done checking for max_fail_percentage 25201 1726882684.04585: checking to see if all hosts have failed and the running result is not ok 25201 1726882684.04586: done checking to see if all hosts have failed 25201 1726882684.04587: getting the remaining hosts for this loop 25201 1726882684.04588: done getting the remaining hosts for this loop 25201 1726882684.04592: getting the next task for host managed_node2 25201 1726882684.04600: done getting next task for host managed_node2 25201 1726882684.04603: ^ task is: TASK: Include the task 'show_interfaces.yml' 25201 1726882684.04606: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25201 1726882684.04610: getting variables 25201 1726882684.04611: in VariableManager get_vars() 25201 1726882684.04649: Calling all_inventory to load vars for managed_node2 25201 1726882684.04651: Calling groups_inventory to load vars for managed_node2 25201 1726882684.04654: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882684.04669: Calling all_plugins_play to load vars for managed_node2 25201 1726882684.04673: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882684.04677: Calling groups_plugins_play to load vars for managed_node2 25201 1726882684.04853: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882684.05090: done with get_vars() 25201 1726882684.05100: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:13 Friday 20 September 2024 21:38:04 -0400 (0:00:00.023) 0:00:05.227 ****** 25201 1726882684.05324: entering _queue_task() for managed_node2/include_tasks 25201 1726882684.05583: worker is 1 (out of 1 available) 25201 1726882684.05593: exiting _queue_task() for managed_node2/include_tasks 25201 1726882684.05604: done queuing things up, now waiting for results queue to drain 25201 1726882684.05605: waiting for pending results... 25201 1726882684.05840: running TaskExecutor() for managed_node2/TASK: Include the task 'show_interfaces.yml' 25201 1726882684.05947: in run() - task 0e448fcc-3ce9-313b-197e-000000000158 25201 1726882684.05977: variable 'ansible_search_path' from source: unknown 25201 1726882684.05984: variable 'ansible_search_path' from source: unknown 25201 1726882684.06018: calling self._execute() 25201 1726882684.06166: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882684.06186: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882684.06200: variable 'omit' from source: magic vars 25201 1726882684.06559: variable 'ansible_distribution_major_version' from source: facts 25201 1726882684.06580: Evaluated conditional (ansible_distribution_major_version != '6'): True 25201 1726882684.06594: _execute() done 25201 1726882684.06601: dumping result to json 25201 1726882684.06607: done dumping result, returning 25201 1726882684.06625: done running TaskExecutor() for managed_node2/TASK: Include the task 'show_interfaces.yml' [0e448fcc-3ce9-313b-197e-000000000158] 25201 1726882684.06634: sending task result for task 0e448fcc-3ce9-313b-197e-000000000158 25201 1726882684.06747: no more pending results, returning what we have 25201 1726882684.06752: in VariableManager get_vars() 25201 1726882684.06845: Calling all_inventory to load vars for managed_node2 25201 1726882684.06848: Calling groups_inventory to load vars for managed_node2 25201 1726882684.06851: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882684.06867: Calling all_plugins_play to load vars for managed_node2 25201 1726882684.06871: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882684.06874: Calling groups_plugins_play to load vars for managed_node2 25201 1726882684.07040: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882684.07258: done with get_vars() 25201 1726882684.07269: variable 'ansible_search_path' from source: unknown 
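Every task in this trace logs "Evaluated conditional (ansible_distribution_major_version != '6')" before doing anything else, which is how a when: guard, whether inherited from an enclosing block or attached per task, shows up at this verbosity. A minimal sketch of that pattern, assuming block-level placement; the actual placement is not visible in the log.

```yaml
# Minimal sketch of the recurring distro guard; whether it sits on a block or on
# each task individually cannot be determined from this trace.
- block:
    - name: Include the task 'show_interfaces.yml'
      include_tasks: show_interfaces.yml   # relative path is an assumption
  when: ansible_distribution_major_version != '6'
```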
25201 1726882684.07270: variable 'ansible_search_path' from source: unknown 25201 1726882684.07313: we have included files to process 25201 1726882684.07315: generating all_blocks data 25201 1726882684.07317: done generating all_blocks data 25201 1726882684.07323: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 25201 1726882684.07324: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 25201 1726882684.07326: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 25201 1726882684.07553: in VariableManager get_vars() 25201 1726882684.07580: done with get_vars() 25201 1726882684.07729: done sending task result for task 0e448fcc-3ce9-313b-197e-000000000158 25201 1726882684.07732: WORKER PROCESS EXITING 25201 1726882684.07819: done processing included file 25201 1726882684.07821: iterating over new_blocks loaded from include file 25201 1726882684.07822: in VariableManager get_vars() 25201 1726882684.07850: done with get_vars() 25201 1726882684.07852: filtering new block on tags 25201 1726882684.07874: done filtering new block on tags 25201 1726882684.07877: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed_node2 25201 1726882684.07881: extending task lists for all hosts with included blocks 25201 1726882684.08312: done extending task lists 25201 1726882684.08313: done processing included files 25201 1726882684.08314: results queue empty 25201 1726882684.08315: checking for any_errors_fatal 25201 1726882684.08318: done checking for any_errors_fatal 25201 1726882684.08318: checking for max_fail_percentage 25201 1726882684.08319: done checking for max_fail_percentage 25201 1726882684.08320: checking to see if all hosts have failed and the running result is not ok 25201 1726882684.08321: done checking to see if all hosts have failed 25201 1726882684.08322: getting the remaining hosts for this loop 25201 1726882684.08323: done getting the remaining hosts for this loop 25201 1726882684.08325: getting the next task for host managed_node2 25201 1726882684.08329: done getting next task for host managed_node2 25201 1726882684.08331: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 25201 1726882684.08334: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25201 1726882684.08336: getting variables 25201 1726882684.08337: in VariableManager get_vars() 25201 1726882684.08349: Calling all_inventory to load vars for managed_node2 25201 1726882684.08351: Calling groups_inventory to load vars for managed_node2 25201 1726882684.08353: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882684.08358: Calling all_plugins_play to load vars for managed_node2 25201 1726882684.08360: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882684.08366: Calling groups_plugins_play to load vars for managed_node2 25201 1726882684.08543: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882684.08720: done with get_vars() 25201 1726882684.08728: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Friday 20 September 2024 21:38:04 -0400 (0:00:00.034) 0:00:05.262 ****** 25201 1726882684.08793: entering _queue_task() for managed_node2/include_tasks 25201 1726882684.09021: worker is 1 (out of 1 available) 25201 1726882684.09033: exiting _queue_task() for managed_node2/include_tasks 25201 1726882684.09045: done queuing things up, now waiting for results queue to drain 25201 1726882684.09046: waiting for pending results... 25201 1726882684.09246: running TaskExecutor() for managed_node2/TASK: Include the task 'get_current_interfaces.yml' 25201 1726882684.09330: in run() - task 0e448fcc-3ce9-313b-197e-00000000017f 25201 1726882684.09342: variable 'ansible_search_path' from source: unknown 25201 1726882684.09345: variable 'ansible_search_path' from source: unknown 25201 1726882684.09378: calling self._execute() 25201 1726882684.09449: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882684.09453: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882684.09467: variable 'omit' from source: magic vars 25201 1726882684.09798: variable 'ansible_distribution_major_version' from source: facts 25201 1726882684.09816: Evaluated conditional (ansible_distribution_major_version != '6'): True 25201 1726882684.09826: _execute() done 25201 1726882684.09833: dumping result to json 25201 1726882684.09839: done dumping result, returning 25201 1726882684.09848: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_current_interfaces.yml' [0e448fcc-3ce9-313b-197e-00000000017f] 25201 1726882684.09857: sending task result for task 0e448fcc-3ce9-313b-197e-00000000017f 25201 1726882684.09953: done sending task result for task 0e448fcc-3ce9-313b-197e-00000000017f 25201 1726882684.09959: WORKER PROCESS EXITING 25201 1726882684.10017: no more pending results, returning what we have 25201 1726882684.10022: in VariableManager get_vars() 25201 1726882684.10066: Calling all_inventory to load vars for managed_node2 25201 1726882684.10070: Calling groups_inventory to load vars for managed_node2 25201 1726882684.10072: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882684.10084: Calling all_plugins_play to load vars for managed_node2 25201 1726882684.10087: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882684.10090: Calling groups_plugins_play to load vars for managed_node2 25201 1726882684.10270: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' 
skipped due to reserved name 25201 1726882684.10471: done with get_vars() 25201 1726882684.10478: variable 'ansible_search_path' from source: unknown 25201 1726882684.10479: variable 'ansible_search_path' from source: unknown 25201 1726882684.10537: we have included files to process 25201 1726882684.10538: generating all_blocks data 25201 1726882684.10541: done generating all_blocks data 25201 1726882684.10542: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 25201 1726882684.10543: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 25201 1726882684.10545: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 25201 1726882684.10941: done processing included file 25201 1726882684.10943: iterating over new_blocks loaded from include file 25201 1726882684.10944: in VariableManager get_vars() 25201 1726882684.10965: done with get_vars() 25201 1726882684.10967: filtering new block on tags 25201 1726882684.10985: done filtering new block on tags 25201 1726882684.10986: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed_node2 25201 1726882684.10990: extending task lists for all hosts with included blocks 25201 1726882684.11135: done extending task lists 25201 1726882684.11136: done processing included files 25201 1726882684.11137: results queue empty 25201 1726882684.11138: checking for any_errors_fatal 25201 1726882684.11140: done checking for any_errors_fatal 25201 1726882684.11141: checking for max_fail_percentage 25201 1726882684.11142: done checking for max_fail_percentage 25201 1726882684.11143: checking to see if all hosts have failed and the running result is not ok 25201 1726882684.11144: done checking to see if all hosts have failed 25201 1726882684.11144: getting the remaining hosts for this loop 25201 1726882684.11146: done getting the remaining hosts for this loop 25201 1726882684.11148: getting the next task for host managed_node2 25201 1726882684.11152: done getting next task for host managed_node2 25201 1726882684.11154: ^ task is: TASK: Gather current interface info 25201 1726882684.11158: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 25201 1726882684.11160: getting variables 25201 1726882684.11161: in VariableManager get_vars() 25201 1726882684.11175: Calling all_inventory to load vars for managed_node2 25201 1726882684.11177: Calling groups_inventory to load vars for managed_node2 25201 1726882684.11179: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882684.11184: Calling all_plugins_play to load vars for managed_node2 25201 1726882684.11186: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882684.11189: Calling groups_plugins_play to load vars for managed_node2 25201 1726882684.11348: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882684.11543: done with get_vars() 25201 1726882684.11552: done getting variables 25201 1726882684.11591: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Friday 20 September 2024 21:38:04 -0400 (0:00:00.028) 0:00:05.290 ****** 25201 1726882684.11618: entering _queue_task() for managed_node2/command 25201 1726882684.11837: worker is 1 (out of 1 available) 25201 1726882684.11847: exiting _queue_task() for managed_node2/command 25201 1726882684.11858: done queuing things up, now waiting for results queue to drain 25201 1726882684.11859: waiting for pending results... 
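The command action loaded above backs the task at get_current_interfaces.yml:3. Its arguments appear verbatim in the module invocation later in this trace (chdir /sys/class/net, ls -1), which allows a fairly direct reconstruction; only the register name is a guess.

```yaml
# chdir and the command come from the module_args echoed later in the trace
# ("chdir": "/sys/class/net", "_raw_params": "ls -1"); the register name is a guess.
- name: Gather current interface info
  command:
    cmd: ls -1
    chdir: /sys/class/net
  register: _current_interface_info
```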
25201 1726882684.12086: running TaskExecutor() for managed_node2/TASK: Gather current interface info 25201 1726882684.12197: in run() - task 0e448fcc-3ce9-313b-197e-0000000001b6 25201 1726882684.12217: variable 'ansible_search_path' from source: unknown 25201 1726882684.12223: variable 'ansible_search_path' from source: unknown 25201 1726882684.12256: calling self._execute() 25201 1726882684.12334: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882684.12346: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882684.12359: variable 'omit' from source: magic vars 25201 1726882684.12696: variable 'ansible_distribution_major_version' from source: facts 25201 1726882684.12713: Evaluated conditional (ansible_distribution_major_version != '6'): True 25201 1726882684.12724: variable 'omit' from source: magic vars 25201 1726882684.12780: variable 'omit' from source: magic vars 25201 1726882684.12817: variable 'omit' from source: magic vars 25201 1726882684.12862: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25201 1726882684.12902: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25201 1726882684.13040: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25201 1726882684.13065: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25201 1726882684.13084: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25201 1726882684.13116: variable 'inventory_hostname' from source: host vars for 'managed_node2' 25201 1726882684.13124: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882684.13131: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882684.13232: Set connection var ansible_shell_executable to /bin/sh 25201 1726882684.13242: Set connection var ansible_pipelining to False 25201 1726882684.13254: Set connection var ansible_connection to ssh 25201 1726882684.13262: Set connection var ansible_module_compression to ZIP_DEFLATED 25201 1726882684.13271: Set connection var ansible_shell_type to sh 25201 1726882684.13281: Set connection var ansible_timeout to 10 25201 1726882684.13305: variable 'ansible_shell_executable' from source: unknown 25201 1726882684.13314: variable 'ansible_connection' from source: unknown 25201 1726882684.13321: variable 'ansible_module_compression' from source: unknown 25201 1726882684.13328: variable 'ansible_shell_type' from source: unknown 25201 1726882684.13335: variable 'ansible_shell_executable' from source: unknown 25201 1726882684.13341: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882684.13349: variable 'ansible_pipelining' from source: unknown 25201 1726882684.13360: variable 'ansible_timeout' from source: unknown 25201 1726882684.13372: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882684.13510: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 25201 1726882684.13527: variable 'omit' from source: magic vars 25201 
1726882684.13536: starting attempt loop 25201 1726882684.13543: running the handler 25201 1726882684.13561: _low_level_execute_command(): starting 25201 1726882684.13691: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 25201 1726882684.15392: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25201 1726882684.15437: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882684.15453: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882684.15482: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882684.15528: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882684.15546: stderr chunk (state=3): >>>debug2: match not found <<< 25201 1726882684.15579: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882684.15605: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25201 1726882684.15618: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 25201 1726882684.15650: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 25201 1726882684.15680: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882684.15985: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882684.16060: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882684.16157: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 25201 1726882684.18066: stdout chunk (state=3): >>>/root <<< 25201 1726882684.18177: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882684.18239: stderr chunk (state=3): >>><<< 25201 1726882684.18242: stdout chunk (state=3): >>><<< 25201 1726882684.18269: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 
4 debug2: Received exit status from master 0 25201 1726882684.18353: _low_level_execute_command(): starting 25201 1726882684.18356: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882684.1826062-25456-33082631800628 `" && echo ansible-tmp-1726882684.1826062-25456-33082631800628="` echo /root/.ansible/tmp/ansible-tmp-1726882684.1826062-25456-33082631800628 `" ) && sleep 0' 25201 1726882684.19071: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25201 1726882684.19087: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882684.19103: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882684.19130: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882684.19174: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882684.19191: stderr chunk (state=3): >>>debug2: match not found <<< 25201 1726882684.19206: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882684.19224: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25201 1726882684.19235: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 25201 1726882684.19246: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 25201 1726882684.19258: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882684.19277: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882684.19291: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882684.19309: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882684.19321: stderr chunk (state=3): >>>debug2: match found <<< 25201 1726882684.19336: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882684.19418: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882684.19480: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882684.19498: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882684.19659: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 25201 1726882684.22149: stdout chunk (state=3): >>>ansible-tmp-1726882684.1826062-25456-33082631800628=/root/.ansible/tmp/ansible-tmp-1726882684.1826062-25456-33082631800628 <<< 25201 1726882684.22370: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882684.22373: stdout chunk (state=3): >>><<< 25201 1726882684.22375: stderr chunk (state=3): >>><<< 25201 1726882684.22378: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882684.1826062-25456-33082631800628=/root/.ansible/tmp/ansible-tmp-1726882684.1826062-25456-33082631800628 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 
originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 25201 1726882684.22383: variable 'ansible_module_compression' from source: unknown 25201 1726882684.22569: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-25201fmfeipqk/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 25201 1726882684.22572: variable 'ansible_facts' from source: unknown 25201 1726882684.22574: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882684.1826062-25456-33082631800628/AnsiballZ_command.py 25201 1726882684.23603: Sending initial data 25201 1726882684.23606: Sent initial data (155 bytes) 25201 1726882684.26012: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882684.26016: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882684.26109: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882684.26115: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882684.26195: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882684.26202: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882684.26291: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882684.26294: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882684.26384: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882684.26519: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882684.28238: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: 
Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 25201 1726882684.28330: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 25201 1726882684.28429: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-25201fmfeipqk/tmppqg4kyc_ /root/.ansible/tmp/ansible-tmp-1726882684.1826062-25456-33082631800628/AnsiballZ_command.py <<< 25201 1726882684.28522: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 25201 1726882684.30211: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882684.30270: stderr chunk (state=3): >>><<< 25201 1726882684.30274: stdout chunk (state=3): >>><<< 25201 1726882684.30276: done transferring module to remote 25201 1726882684.30278: _low_level_execute_command(): starting 25201 1726882684.30281: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882684.1826062-25456-33082631800628/ /root/.ansible/tmp/ansible-tmp-1726882684.1826062-25456-33082631800628/AnsiballZ_command.py && sleep 0' 25201 1726882684.31745: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882684.31749: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882684.31792: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882684.31799: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882684.31802: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<< 25201 1726882684.31804: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882684.31968: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882684.31974: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882684.31977: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882684.32083: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882684.33826: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882684.33895: stderr chunk (state=3): >>><<< 25201 1726882684.33898: stdout chunk (state=3): >>><<< 25201 1726882684.33993: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25201 1726882684.33996: _low_level_execute_command(): starting 25201 1726882684.33998: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882684.1826062-25456-33082631800628/AnsiballZ_command.py && sleep 0' 25201 1726882684.34957: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882684.34960: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882684.34996: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 25201 1726882684.34999: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882684.35002: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882684.35150: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882684.35170: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882684.35182: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882684.35315: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882684.48796: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 21:38:04.482605", "end": "2024-09-20 21:38:04.485996", "delta": "0:00:00.003391", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 25201 1726882684.50086: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
<<< 25201 1726882684.50185: stderr chunk (state=3): >>><<< 25201 1726882684.50189: stdout chunk (state=3): >>><<< 25201 1726882684.50336: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 21:38:04.482605", "end": "2024-09-20 21:38:04.485996", "delta": "0:00:00.003391", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
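The JSON above is the raw return of ansible.legacy.command: stdout lists bonding_masters, eth0 and lo, matching the current_interfaces value printed by the earlier debug task. The trace ends while queuing TASK: Set current_interfaces, which presumably turns the registered stdout_lines into that list; a hedged sketch, with the registered variable name assumed.

```yaml
# Hedged sketch of the follow-up set_fact; current_interfaces is known from the
# earlier debug output, the registered variable name is an assumption.
- name: Set current_interfaces
  set_fact:
    current_interfaces: "{{ _current_interface_info.stdout_lines }}"
```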
25201 1726882684.50340: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882684.1826062-25456-33082631800628/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 25201 1726882684.50343: _low_level_execute_command(): starting 25201 1726882684.50345: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882684.1826062-25456-33082631800628/ > /dev/null 2>&1 && sleep 0' 25201 1726882684.50947: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25201 1726882684.50960: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882684.50983: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882684.51001: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882684.51045: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882684.51057: stderr chunk (state=3): >>>debug2: match not found <<< 25201 1726882684.51075: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882684.51092: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25201 1726882684.51102: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 25201 1726882684.51118: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 25201 1726882684.51129: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882684.51141: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882684.51154: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882684.51169: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882684.51181: stderr chunk (state=3): >>>debug2: match found <<< 25201 1726882684.51194: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882684.51278: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882684.51299: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882684.51314: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882684.51445: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882684.53381: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882684.53385: stdout chunk (state=3): >>><<< 25201 1726882684.53392: stderr chunk (state=3): >>><<< 25201 1726882684.53414: _low_level_execute_command() done: rc=0, stdout=, 
stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25201 1726882684.53420: handler run complete 25201 1726882684.53445: Evaluated conditional (False): False 25201 1726882684.53456: attempt loop complete, returning result 25201 1726882684.53459: _execute() done 25201 1726882684.53461: dumping result to json 25201 1726882684.53468: done dumping result, returning 25201 1726882684.53480: done running TaskExecutor() for managed_node2/TASK: Gather current interface info [0e448fcc-3ce9-313b-197e-0000000001b6] 25201 1726882684.53485: sending task result for task 0e448fcc-3ce9-313b-197e-0000000001b6 25201 1726882684.53595: done sending task result for task 0e448fcc-3ce9-313b-197e-0000000001b6 25201 1726882684.53598: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003391", "end": "2024-09-20 21:38:04.485996", "rc": 0, "start": "2024-09-20 21:38:04.482605" } STDOUT: bonding_masters eth0 lo 25201 1726882684.53835: no more pending results, returning what we have 25201 1726882684.53837: results queue empty 25201 1726882684.53838: checking for any_errors_fatal 25201 1726882684.53840: done checking for any_errors_fatal 25201 1726882684.53840: checking for max_fail_percentage 25201 1726882684.53842: done checking for max_fail_percentage 25201 1726882684.53843: checking to see if all hosts have failed and the running result is not ok 25201 1726882684.53844: done checking to see if all hosts have failed 25201 1726882684.53844: getting the remaining hosts for this loop 25201 1726882684.53846: done getting the remaining hosts for this loop 25201 1726882684.53849: getting the next task for host managed_node2 25201 1726882684.53855: done getting next task for host managed_node2 25201 1726882684.53857: ^ task is: TASK: Set current_interfaces 25201 1726882684.53862: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25201 1726882684.53867: getting variables 25201 1726882684.53868: in VariableManager get_vars() 25201 1726882684.53903: Calling all_inventory to load vars for managed_node2 25201 1726882684.53905: Calling groups_inventory to load vars for managed_node2 25201 1726882684.53907: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882684.53916: Calling all_plugins_play to load vars for managed_node2 25201 1726882684.53919: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882684.53921: Calling groups_plugins_play to load vars for managed_node2 25201 1726882684.54083: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882684.54321: done with get_vars() 25201 1726882684.54331: done getting variables 25201 1726882684.54395: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Friday 20 September 2024 21:38:04 -0400 (0:00:00.428) 0:00:05.718 ****** 25201 1726882684.54425: entering _queue_task() for managed_node2/set_fact 25201 1726882684.54654: worker is 1 (out of 1 available) 25201 1726882684.54669: exiting _queue_task() for managed_node2/set_fact 25201 1726882684.54682: done queuing things up, now waiting for results queue to drain 25201 1726882684.54684: waiting for pending results... 
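[Editor's note] The task queued above comes from get_current_interfaces.yml:9. Its source is not reproduced in this log; based on the set_fact action, the _current_interfaces input and the current_interfaces result a few lines further on, it is presumably something like the sketch below (using stdout_lines is an assumption; the log only shows the raw stdout and the resulting list).

- name: Set current_interfaces
  set_fact:
    current_interfaces: "{{ _current_interfaces.stdout_lines }}"   # assumed; yields ['bonding_masters', 'eth0', 'lo'] here
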
25201 1726882684.54970: running TaskExecutor() for managed_node2/TASK: Set current_interfaces 25201 1726882684.55087: in run() - task 0e448fcc-3ce9-313b-197e-0000000001b7 25201 1726882684.55099: variable 'ansible_search_path' from source: unknown 25201 1726882684.55102: variable 'ansible_search_path' from source: unknown 25201 1726882684.55140: calling self._execute() 25201 1726882684.55669: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882684.55686: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882684.55700: variable 'omit' from source: magic vars 25201 1726882684.56097: variable 'ansible_distribution_major_version' from source: facts 25201 1726882684.56113: Evaluated conditional (ansible_distribution_major_version != '6'): True 25201 1726882684.56119: variable 'omit' from source: magic vars 25201 1726882684.56179: variable 'omit' from source: magic vars 25201 1726882684.56308: variable '_current_interfaces' from source: set_fact 25201 1726882684.56390: variable 'omit' from source: magic vars 25201 1726882684.56441: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25201 1726882684.56490: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25201 1726882684.56509: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25201 1726882684.56527: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25201 1726882684.56538: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25201 1726882684.56586: variable 'inventory_hostname' from source: host vars for 'managed_node2' 25201 1726882684.56590: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882684.56609: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882684.56713: Set connection var ansible_shell_executable to /bin/sh 25201 1726882684.56717: Set connection var ansible_pipelining to False 25201 1726882684.56723: Set connection var ansible_connection to ssh 25201 1726882684.56729: Set connection var ansible_module_compression to ZIP_DEFLATED 25201 1726882684.56732: Set connection var ansible_shell_type to sh 25201 1726882684.56739: Set connection var ansible_timeout to 10 25201 1726882684.56761: variable 'ansible_shell_executable' from source: unknown 25201 1726882684.56768: variable 'ansible_connection' from source: unknown 25201 1726882684.56771: variable 'ansible_module_compression' from source: unknown 25201 1726882684.56774: variable 'ansible_shell_type' from source: unknown 25201 1726882684.56776: variable 'ansible_shell_executable' from source: unknown 25201 1726882684.56778: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882684.56787: variable 'ansible_pipelining' from source: unknown 25201 1726882684.56790: variable 'ansible_timeout' from source: unknown 25201 1726882684.56794: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882684.56970: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, 
class_only=False) 25201 1726882684.56978: variable 'omit' from source: magic vars 25201 1726882684.56983: starting attempt loop 25201 1726882684.56989: running the handler 25201 1726882684.57080: handler run complete 25201 1726882684.57091: attempt loop complete, returning result 25201 1726882684.57093: _execute() done 25201 1726882684.57098: dumping result to json 25201 1726882684.57102: done dumping result, returning 25201 1726882684.57332: done running TaskExecutor() for managed_node2/TASK: Set current_interfaces [0e448fcc-3ce9-313b-197e-0000000001b7] 25201 1726882684.57342: sending task result for task 0e448fcc-3ce9-313b-197e-0000000001b7 ok: [managed_node2] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo" ] }, "changed": false } 25201 1726882684.57486: no more pending results, returning what we have 25201 1726882684.57488: results queue empty 25201 1726882684.57490: checking for any_errors_fatal 25201 1726882684.57496: done checking for any_errors_fatal 25201 1726882684.57497: checking for max_fail_percentage 25201 1726882684.57499: done checking for max_fail_percentage 25201 1726882684.57500: checking to see if all hosts have failed and the running result is not ok 25201 1726882684.57501: done checking to see if all hosts have failed 25201 1726882684.57501: getting the remaining hosts for this loop 25201 1726882684.57503: done getting the remaining hosts for this loop 25201 1726882684.57506: getting the next task for host managed_node2 25201 1726882684.57515: done getting next task for host managed_node2 25201 1726882684.57518: ^ task is: TASK: Show current_interfaces 25201 1726882684.57522: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25201 1726882684.57525: getting variables 25201 1726882684.57526: in VariableManager get_vars() 25201 1726882684.57567: Calling all_inventory to load vars for managed_node2 25201 1726882684.57570: Calling groups_inventory to load vars for managed_node2 25201 1726882684.57572: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882684.57582: Calling all_plugins_play to load vars for managed_node2 25201 1726882684.57585: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882684.57588: Calling groups_plugins_play to load vars for managed_node2 25201 1726882684.57958: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882684.58165: done with get_vars() 25201 1726882684.58174: done getting variables 25201 1726882684.58227: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Friday 20 September 2024 21:38:04 -0400 (0:00:00.038) 0:00:05.757 ****** 25201 1726882684.58259: entering _queue_task() for managed_node2/debug 25201 1726882684.58281: done sending task result for task 0e448fcc-3ce9-313b-197e-0000000001b7 25201 1726882684.58289: WORKER PROCESS EXITING 25201 1726882684.58687: worker is 1 (out of 1 available) 25201 1726882684.58699: exiting _queue_task() for managed_node2/debug 25201 1726882684.58710: done queuing things up, now waiting for results queue to drain 25201 1726882684.58712: waiting for pending results... 
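[Editor's note] The debug task queued above (show_interfaces.yml:5) simply prints the fact that was just set. The exact task text is not in this log; given the MSG line that follows ("current_interfaces: [...]"), a plausible sketch is:

- name: Show current_interfaces
  debug:
    msg: "current_interfaces: {{ current_interfaces }}"   # message format matches the MSG output below
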
25201 1726882684.58944: running TaskExecutor() for managed_node2/TASK: Show current_interfaces 25201 1726882684.59047: in run() - task 0e448fcc-3ce9-313b-197e-000000000180 25201 1726882684.59073: variable 'ansible_search_path' from source: unknown 25201 1726882684.59080: variable 'ansible_search_path' from source: unknown 25201 1726882684.59115: calling self._execute() 25201 1726882684.59196: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882684.59207: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882684.59220: variable 'omit' from source: magic vars 25201 1726882684.59579: variable 'ansible_distribution_major_version' from source: facts 25201 1726882684.59595: Evaluated conditional (ansible_distribution_major_version != '6'): True 25201 1726882684.59608: variable 'omit' from source: magic vars 25201 1726882684.59652: variable 'omit' from source: magic vars 25201 1726882684.59754: variable 'current_interfaces' from source: set_fact 25201 1726882684.59787: variable 'omit' from source: magic vars 25201 1726882684.59830: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25201 1726882684.59872: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25201 1726882684.59895: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25201 1726882684.59917: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25201 1726882684.59936: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25201 1726882684.59970: variable 'inventory_hostname' from source: host vars for 'managed_node2' 25201 1726882684.59979: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882684.59986: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882684.60091: Set connection var ansible_shell_executable to /bin/sh 25201 1726882684.60101: Set connection var ansible_pipelining to False 25201 1726882684.60110: Set connection var ansible_connection to ssh 25201 1726882684.60119: Set connection var ansible_module_compression to ZIP_DEFLATED 25201 1726882684.60124: Set connection var ansible_shell_type to sh 25201 1726882684.60135: Set connection var ansible_timeout to 10 25201 1726882684.60167: variable 'ansible_shell_executable' from source: unknown 25201 1726882684.60175: variable 'ansible_connection' from source: unknown 25201 1726882684.60182: variable 'ansible_module_compression' from source: unknown 25201 1726882684.60188: variable 'ansible_shell_type' from source: unknown 25201 1726882684.60194: variable 'ansible_shell_executable' from source: unknown 25201 1726882684.60199: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882684.60206: variable 'ansible_pipelining' from source: unknown 25201 1726882684.60212: variable 'ansible_timeout' from source: unknown 25201 1726882684.60219: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882684.60351: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 
25201 1726882684.60375: variable 'omit' from source: magic vars 25201 1726882684.60385: starting attempt loop 25201 1726882684.60393: running the handler 25201 1726882684.60439: handler run complete 25201 1726882684.60458: attempt loop complete, returning result 25201 1726882684.60473: _execute() done 25201 1726882684.60480: dumping result to json 25201 1726882684.60487: done dumping result, returning 25201 1726882684.60497: done running TaskExecutor() for managed_node2/TASK: Show current_interfaces [0e448fcc-3ce9-313b-197e-000000000180] 25201 1726882684.60506: sending task result for task 0e448fcc-3ce9-313b-197e-000000000180 ok: [managed_node2] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo'] 25201 1726882684.60633: no more pending results, returning what we have 25201 1726882684.60636: results queue empty 25201 1726882684.60637: checking for any_errors_fatal 25201 1726882684.60643: done checking for any_errors_fatal 25201 1726882684.60644: checking for max_fail_percentage 25201 1726882684.60645: done checking for max_fail_percentage 25201 1726882684.60646: checking to see if all hosts have failed and the running result is not ok 25201 1726882684.60647: done checking to see if all hosts have failed 25201 1726882684.60647: getting the remaining hosts for this loop 25201 1726882684.60648: done getting the remaining hosts for this loop 25201 1726882684.60651: getting the next task for host managed_node2 25201 1726882684.60659: done getting next task for host managed_node2 25201 1726882684.60665: ^ task is: TASK: Install iproute 25201 1726882684.60669: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25201 1726882684.60676: getting variables 25201 1726882684.60678: in VariableManager get_vars() 25201 1726882684.60715: Calling all_inventory to load vars for managed_node2 25201 1726882684.60717: Calling groups_inventory to load vars for managed_node2 25201 1726882684.60720: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882684.60729: Calling all_plugins_play to load vars for managed_node2 25201 1726882684.60731: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882684.60733: Calling groups_plugins_play to load vars for managed_node2 25201 1726882684.60909: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882684.61127: done with get_vars() 25201 1726882684.61137: done getting variables 25201 1726882684.61194: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Install iproute] ********************************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16 Friday 20 September 2024 21:38:04 -0400 (0:00:00.029) 0:00:05.786 ****** 25201 1726882684.61228: entering _queue_task() for managed_node2/package 25201 1726882684.61245: done sending task result for task 0e448fcc-3ce9-313b-197e-000000000180 25201 1726882684.61254: WORKER PROCESS EXITING 25201 1726882684.61646: worker is 1 (out of 1 available) 25201 1726882684.61657: exiting _queue_task() for managed_node2/package 25201 1726882684.61672: done queuing things up, now waiting for results queue to drain 25201 1726882684.61674: waiting for pending results... 
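[Editor's note] The package task queued above (manage_test_interface.yml:16) resolves to ansible.legacy.dnf on this host; the module_args later in the log show name=["iproute"], state=present. The role source is not shown here, so this is only a sketch consistent with those arguments and with the "__install_status is success" conditional and "attempts": 1 reported afterwards; the retry settings themselves are not visible in the log.

- name: Install iproute
  package:
    name: iproute
    state: present
  register: __install_status
  until: __install_status is success   # the log evaluates this conditional; retries/delay values are not shown
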
25201 1726882684.61979: running TaskExecutor() for managed_node2/TASK: Install iproute 25201 1726882684.62082: in run() - task 0e448fcc-3ce9-313b-197e-000000000159 25201 1726882684.62099: variable 'ansible_search_path' from source: unknown 25201 1726882684.62107: variable 'ansible_search_path' from source: unknown 25201 1726882684.62148: calling self._execute() 25201 1726882684.62238: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882684.62249: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882684.62268: variable 'omit' from source: magic vars 25201 1726882684.63017: variable 'ansible_distribution_major_version' from source: facts 25201 1726882684.63112: Evaluated conditional (ansible_distribution_major_version != '6'): True 25201 1726882684.63124: variable 'omit' from source: magic vars 25201 1726882684.63166: variable 'omit' from source: magic vars 25201 1726882684.63438: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 25201 1726882684.65696: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 25201 1726882684.65780: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 25201 1726882684.65818: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 25201 1726882684.65852: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 25201 1726882684.65885: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 25201 1726882684.65982: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25201 1726882684.66015: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25201 1726882684.66050: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25201 1726882684.66100: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25201 1726882684.66120: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25201 1726882684.66232: variable '__network_is_ostree' from source: set_fact 25201 1726882684.66247: variable 'omit' from source: magic vars 25201 1726882684.66286: variable 'omit' from source: magic vars 25201 1726882684.66320: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25201 1726882684.66357: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25201 1726882684.66385: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25201 1726882684.66407: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 25201 1726882684.66422: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25201 1726882684.66454: variable 'inventory_hostname' from source: host vars for 'managed_node2' 25201 1726882684.66471: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882684.66480: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882684.66594: Set connection var ansible_shell_executable to /bin/sh 25201 1726882684.66604: Set connection var ansible_pipelining to False 25201 1726882684.66613: Set connection var ansible_connection to ssh 25201 1726882684.66623: Set connection var ansible_module_compression to ZIP_DEFLATED 25201 1726882684.66630: Set connection var ansible_shell_type to sh 25201 1726882684.66642: Set connection var ansible_timeout to 10 25201 1726882684.66673: variable 'ansible_shell_executable' from source: unknown 25201 1726882684.66685: variable 'ansible_connection' from source: unknown 25201 1726882684.66693: variable 'ansible_module_compression' from source: unknown 25201 1726882684.66700: variable 'ansible_shell_type' from source: unknown 25201 1726882684.66713: variable 'ansible_shell_executable' from source: unknown 25201 1726882684.66722: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882684.66729: variable 'ansible_pipelining' from source: unknown 25201 1726882684.66735: variable 'ansible_timeout' from source: unknown 25201 1726882684.66741: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882684.66839: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 25201 1726882684.66853: variable 'omit' from source: magic vars 25201 1726882684.66860: starting attempt loop 25201 1726882684.66869: running the handler 25201 1726882684.66882: variable 'ansible_facts' from source: unknown 25201 1726882684.66888: variable 'ansible_facts' from source: unknown 25201 1726882684.66923: _low_level_execute_command(): starting 25201 1726882684.66933: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 25201 1726882684.67730: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25201 1726882684.67743: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882684.67759: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882684.67785: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882684.67824: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882684.67834: stderr chunk (state=3): >>>debug2: match not found <<< 25201 1726882684.67845: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882684.67860: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25201 1726882684.67881: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 25201 1726882684.67890: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 25201 
1726882684.67900: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882684.67911: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882684.67923: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882684.67932: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882684.67940: stderr chunk (state=3): >>>debug2: match found <<< 25201 1726882684.67951: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882684.68029: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882684.68057: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882684.68075: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882684.68217: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882684.69891: stdout chunk (state=3): >>>/root <<< 25201 1726882684.70060: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882684.70067: stdout chunk (state=3): >>><<< 25201 1726882684.70079: stderr chunk (state=3): >>><<< 25201 1726882684.70102: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25201 1726882684.70113: _low_level_execute_command(): starting 25201 1726882684.70119: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882684.7010157-25488-171074447101604 `" && echo ansible-tmp-1726882684.7010157-25488-171074447101604="` echo /root/.ansible/tmp/ansible-tmp-1726882684.7010157-25488-171074447101604 `" ) && sleep 0' 25201 1726882684.70745: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25201 1726882684.70756: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882684.70777: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882684.70791: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882684.70829: stderr chunk (state=3): >>>debug2: checking match for 'final all' 
host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882684.70836: stderr chunk (state=3): >>>debug2: match not found <<< 25201 1726882684.70847: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882684.70860: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25201 1726882684.70869: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 25201 1726882684.70877: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 25201 1726882684.70885: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882684.70894: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882684.70906: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882684.70914: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882684.70920: stderr chunk (state=3): >>>debug2: match found <<< 25201 1726882684.70928: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882684.70992: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882684.71011: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882684.71023: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882684.71149: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882684.73019: stdout chunk (state=3): >>>ansible-tmp-1726882684.7010157-25488-171074447101604=/root/.ansible/tmp/ansible-tmp-1726882684.7010157-25488-171074447101604 <<< 25201 1726882684.73126: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882684.73193: stderr chunk (state=3): >>><<< 25201 1726882684.73196: stdout chunk (state=3): >>><<< 25201 1726882684.73227: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882684.7010157-25488-171074447101604=/root/.ansible/tmp/ansible-tmp-1726882684.7010157-25488-171074447101604 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25201 1726882684.73254: variable 'ansible_module_compression' from source: unknown 25201 1726882684.73317: ANSIBALLZ: Using generic lock for ansible.legacy.dnf 
25201 1726882684.73321: ANSIBALLZ: Acquiring lock 25201 1726882684.73323: ANSIBALLZ: Lock acquired: 140300039193808 25201 1726882684.73325: ANSIBALLZ: Creating module 25201 1726882684.88899: ANSIBALLZ: Writing module into payload 25201 1726882684.89166: ANSIBALLZ: Writing module 25201 1726882684.89191: ANSIBALLZ: Renaming module 25201 1726882684.89203: ANSIBALLZ: Done creating module 25201 1726882684.89220: variable 'ansible_facts' from source: unknown 25201 1726882684.89299: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882684.7010157-25488-171074447101604/AnsiballZ_dnf.py 25201 1726882684.89445: Sending initial data 25201 1726882684.89449: Sent initial data (152 bytes) 25201 1726882684.90421: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25201 1726882684.90432: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882684.90443: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882684.90457: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882684.90502: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882684.90509: stderr chunk (state=3): >>>debug2: match not found <<< 25201 1726882684.90519: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882684.90531: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25201 1726882684.90539: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 25201 1726882684.90545: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 25201 1726882684.90551: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882684.90566: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882684.90575: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882684.90582: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882684.90589: stderr chunk (state=3): >>>debug2: match found <<< 25201 1726882684.90598: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882684.90674: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882684.90694: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882684.90706: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882684.90838: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882684.92686: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 25201 1726882684.92690: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension 
"expand-path@openssh.com" revision 1 <<< 25201 1726882684.92793: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 25201 1726882684.92939: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-25201fmfeipqk/tmpbwitnflx /root/.ansible/tmp/ansible-tmp-1726882684.7010157-25488-171074447101604/AnsiballZ_dnf.py <<< 25201 1726882684.92995: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 25201 1726882684.95140: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882684.95252: stderr chunk (state=3): >>><<< 25201 1726882684.95256: stdout chunk (state=3): >>><<< 25201 1726882684.95258: done transferring module to remote 25201 1726882684.95260: _low_level_execute_command(): starting 25201 1726882684.95267: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882684.7010157-25488-171074447101604/ /root/.ansible/tmp/ansible-tmp-1726882684.7010157-25488-171074447101604/AnsiballZ_dnf.py && sleep 0' 25201 1726882684.95923: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882684.95941: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882684.95944: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882684.95969: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882684.95973: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882684.95975: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882684.96046: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882684.96049: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882684.96158: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882684.98062: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882684.98067: stdout chunk (state=3): >>><<< 25201 1726882684.98069: stderr chunk (state=3): >>><<< 25201 1726882684.98156: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25201 1726882684.98160: _low_level_execute_command(): starting 25201 1726882684.98162: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882684.7010157-25488-171074447101604/AnsiballZ_dnf.py && sleep 0' 25201 1726882684.98700: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25201 1726882684.98714: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882684.98728: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882684.98744: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882684.98789: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882684.98802: stderr chunk (state=3): >>>debug2: match not found <<< 25201 1726882684.98816: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882684.98835: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25201 1726882684.98847: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 25201 1726882684.98860: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 25201 1726882684.98876: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882684.98890: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882684.98907: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882684.98920: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882684.98931: stderr chunk (state=3): >>>debug2: match found <<< 25201 1726882684.98945: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882684.99018: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882684.99040: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882684.99067: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882684.99203: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882686.00568: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, 
"install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 25201 1726882686.06187: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. <<< 25201 1726882686.06283: stderr chunk (state=3): >>><<< 25201 1726882686.06287: stdout chunk (state=3): >>><<< 25201 1726882686.06370: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
25201 1726882686.06374: done with _execute_module (ansible.legacy.dnf, {'name': 'iproute', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882684.7010157-25488-171074447101604/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 25201 1726882686.06377: _low_level_execute_command(): starting 25201 1726882686.06379: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882684.7010157-25488-171074447101604/ > /dev/null 2>&1 && sleep 0' 25201 1726882686.07012: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25201 1726882686.07025: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882686.07040: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882686.07057: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882686.07103: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882686.07116: stderr chunk (state=3): >>>debug2: match not found <<< 25201 1726882686.07129: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882686.07145: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25201 1726882686.07156: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 25201 1726882686.07170: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 25201 1726882686.07182: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882686.07194: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882686.07208: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882686.07219: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882686.07231: stderr chunk (state=3): >>>debug2: match found <<< 25201 1726882686.07243: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882686.07323: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882686.07346: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882686.07360: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882686.07501: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882686.09403: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882686.09407: stdout chunk (state=3): >>><<< 25201 1726882686.09409: stderr chunk (state=3): >>><<< 25201 1726882686.09470: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 
3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25201 1726882686.09474: handler run complete 25201 1726882686.09773: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 25201 1726882686.09802: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 25201 1726882686.09843: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 25201 1726882686.09890: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 25201 1726882686.09923: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 25201 1726882686.10004: variable '__install_status' from source: unknown 25201 1726882686.10026: Evaluated conditional (__install_status is success): True 25201 1726882686.10047: attempt loop complete, returning result 25201 1726882686.10054: _execute() done 25201 1726882686.10060: dumping result to json 25201 1726882686.10073: done dumping result, returning 25201 1726882686.10084: done running TaskExecutor() for managed_node2/TASK: Install iproute [0e448fcc-3ce9-313b-197e-000000000159] 25201 1726882686.10093: sending task result for task 0e448fcc-3ce9-313b-197e-000000000159 ok: [managed_node2] => { "attempts": 1, "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 25201 1726882686.10368: no more pending results, returning what we have 25201 1726882686.10372: results queue empty 25201 1726882686.10373: checking for any_errors_fatal 25201 1726882686.10377: done checking for any_errors_fatal 25201 1726882686.10378: checking for max_fail_percentage 25201 1726882686.10380: done checking for max_fail_percentage 25201 1726882686.10380: checking to see if all hosts have failed and the running result is not ok 25201 1726882686.10381: done checking to see if all hosts have failed 25201 1726882686.10382: getting the remaining hosts for this loop 25201 1726882686.10383: done getting the remaining hosts for this loop 25201 1726882686.10387: getting the next task for host managed_node2 25201 1726882686.10393: done getting next task for host managed_node2 25201 1726882686.10396: ^ task is: TASK: Create veth interface {{ interface }} 25201 1726882686.10399: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child 
state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25201 1726882686.10401: getting variables 25201 1726882686.10403: in VariableManager get_vars() 25201 1726882686.10434: Calling all_inventory to load vars for managed_node2 25201 1726882686.10436: Calling groups_inventory to load vars for managed_node2 25201 1726882686.10439: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882686.10449: Calling all_plugins_play to load vars for managed_node2 25201 1726882686.10451: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882686.10454: Calling groups_plugins_play to load vars for managed_node2 25201 1726882686.10623: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882686.10846: done with get_vars() 25201 1726882686.10856: done getting variables 25201 1726882686.11033: done sending task result for task 0e448fcc-3ce9-313b-197e-000000000159 25201 1726882686.11036: WORKER PROCESS EXITING 25201 1726882686.11072: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 25201 1726882686.11308: variable 'interface' from source: play vars TASK [Create veth interface veth0] ********************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27 Friday 20 September 2024 21:38:06 -0400 (0:00:01.502) 0:00:07.289 ****** 25201 1726882686.11451: entering _queue_task() for managed_node2/command 25201 1726882686.11688: worker is 1 (out of 1 available) 25201 1726882686.11700: exiting _queue_task() for managed_node2/command 25201 1726882686.11711: done queuing things up, now waiting for results queue to drain 25201 1726882686.11713: waiting for pending results... 
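The two tasks covered in this stretch of the log, "Install iproute" (ansible.legacy.dnf with name=iproute, state=present, retried until "__install_status is success") and the just-queued "Create veth interface veth0" loop from manage_test_interface.yml:27, are not reproduced as task files anywhere in this output. The YAML below is therefore only a sketch reconstructed from the module arguments, conditionals, and loop items recorded in the log; the retry values and the third loop item are assumptions, and the separately evaluated ansible_distribution_major_version != '6' guard is omitted.

- name: Install iproute
  dnf:
    name: iproute
    state: present
  register: __install_status
  until: __install_status is success
  retries: 3            # assumed; the log only records "attempts": 1
  delay: 5              # assumed

- name: Create veth interface {{ interface }}
  command: "{{ item }}"
  with_items:
    - ip link add {{ interface }} type veth peer name peer{{ interface }}
    - ip link set peer{{ interface }} up
    - ip link set {{ interface }} up    # assumed; this excerpt has not reached a third item yet
  when: type == 'veth' and state == 'present' and interface not in current_interfaces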
25201 1726882686.11973: running TaskExecutor() for managed_node2/TASK: Create veth interface veth0 25201 1726882686.12085: in run() - task 0e448fcc-3ce9-313b-197e-00000000015a 25201 1726882686.12109: variable 'ansible_search_path' from source: unknown 25201 1726882686.12117: variable 'ansible_search_path' from source: unknown 25201 1726882686.12391: variable 'interface' from source: play vars 25201 1726882686.12492: variable 'interface' from source: play vars 25201 1726882686.12579: variable 'interface' from source: play vars 25201 1726882686.12742: Loaded config def from plugin (lookup/items) 25201 1726882686.12759: Loading LookupModule 'items' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/items.py 25201 1726882686.12790: variable 'omit' from source: magic vars 25201 1726882686.12927: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882686.12943: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882686.12959: variable 'omit' from source: magic vars 25201 1726882686.13213: variable 'ansible_distribution_major_version' from source: facts 25201 1726882686.13226: Evaluated conditional (ansible_distribution_major_version != '6'): True 25201 1726882686.13446: variable 'type' from source: play vars 25201 1726882686.13467: variable 'state' from source: include params 25201 1726882686.13479: variable 'interface' from source: play vars 25201 1726882686.13489: variable 'current_interfaces' from source: set_fact 25201 1726882686.13501: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 25201 1726882686.13516: variable 'omit' from source: magic vars 25201 1726882686.13554: variable 'omit' from source: magic vars 25201 1726882686.13610: variable 'item' from source: unknown 25201 1726882686.13692: variable 'item' from source: unknown 25201 1726882686.13712: variable 'omit' from source: magic vars 25201 1726882686.13752: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25201 1726882686.13796: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25201 1726882686.13819: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25201 1726882686.13886: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25201 1726882686.13907: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25201 1726882686.13939: variable 'inventory_hostname' from source: host vars for 'managed_node2' 25201 1726882686.13954: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882686.13965: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882686.14077: Set connection var ansible_shell_executable to /bin/sh 25201 1726882686.14088: Set connection var ansible_pipelining to False 25201 1726882686.14098: Set connection var ansible_connection to ssh 25201 1726882686.14108: Set connection var ansible_module_compression to ZIP_DEFLATED 25201 1726882686.14120: Set connection var ansible_shell_type to sh 25201 1726882686.14132: Set connection var ansible_timeout to 10 25201 1726882686.14156: variable 'ansible_shell_executable' from source: unknown 25201 1726882686.14172: variable 'ansible_connection' from source: unknown 25201 1726882686.14182: 
variable 'ansible_module_compression' from source: unknown 25201 1726882686.14188: variable 'ansible_shell_type' from source: unknown 25201 1726882686.14195: variable 'ansible_shell_executable' from source: unknown 25201 1726882686.14203: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882686.14210: variable 'ansible_pipelining' from source: unknown 25201 1726882686.14216: variable 'ansible_timeout' from source: unknown 25201 1726882686.14228: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882686.14371: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 25201 1726882686.14393: variable 'omit' from source: magic vars 25201 1726882686.14402: starting attempt loop 25201 1726882686.14409: running the handler 25201 1726882686.14427: _low_level_execute_command(): starting 25201 1726882686.14443: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 25201 1726882686.15234: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25201 1726882686.15247: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882686.15268: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882686.15285: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882686.15328: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882686.15339: stderr chunk (state=3): >>>debug2: match not found <<< 25201 1726882686.15350: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882686.15375: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25201 1726882686.15386: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 25201 1726882686.15395: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 25201 1726882686.15405: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882686.15423: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882686.15438: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882686.15449: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882686.15458: stderr chunk (state=3): >>>debug2: match found <<< 25201 1726882686.15476: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882686.15557: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882686.15585: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882686.15604: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882686.15735: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882686.17346: stdout chunk (state=3): >>>/root <<< 25201 1726882686.17452: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 
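Every _low_level_execute_command() call in this task shows the same OpenSSH stderr pattern, "auto-mux: Trying existing master" followed by "mux_client_request_session": the commands are all riding one multiplexed ControlPersist connection to 10.31.11.158 rather than opening fresh SSH sessions. ansible-core's default ssh_args normally request exactly this behaviour (ControlMaster=auto with a ControlPersist timeout); the fragment below only illustrates where it can be tuned per host or group, and the group_vars/all.yml placement is an assumption rather than something taken from this run.

# group_vars/all.yml (illustrative placement, not part of this run)
ansible_ssh_common_args: "-o ControlMaster=auto -o ControlPersist=60s"
ansible_ssh_pipelining: false    # matches "Set connection var ansible_pipelining to False" above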
25201 1726882686.17521: stderr chunk (state=3): >>><<< 25201 1726882686.17527: stdout chunk (state=3): >>><<< 25201 1726882686.17551: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25201 1726882686.17568: _low_level_execute_command(): starting 25201 1726882686.17573: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882686.1755102-25546-51453595000327 `" && echo ansible-tmp-1726882686.1755102-25546-51453595000327="` echo /root/.ansible/tmp/ansible-tmp-1726882686.1755102-25546-51453595000327 `" ) && sleep 0' 25201 1726882686.18159: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25201 1726882686.18170: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882686.18179: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882686.18193: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882686.18228: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882686.18235: stderr chunk (state=3): >>>debug2: match not found <<< 25201 1726882686.18244: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882686.18256: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25201 1726882686.18267: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 25201 1726882686.18270: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 25201 1726882686.18282: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882686.18288: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882686.18299: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882686.18306: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882686.18312: stderr chunk (state=3): >>>debug2: match found <<< 25201 1726882686.18321: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882686.18394: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882686.18410: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882686.18421: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882686.18543: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882686.20460: stdout chunk (state=3): >>>ansible-tmp-1726882686.1755102-25546-51453595000327=/root/.ansible/tmp/ansible-tmp-1726882686.1755102-25546-51453595000327 <<< 25201 1726882686.20582: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882686.20605: stderr chunk (state=3): >>><<< 25201 1726882686.20608: stdout chunk (state=3): >>><<< 25201 1726882686.20629: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882686.1755102-25546-51453595000327=/root/.ansible/tmp/ansible-tmp-1726882686.1755102-25546-51453595000327 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25201 1726882686.20661: variable 'ansible_module_compression' from source: unknown 25201 1726882686.20715: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-25201fmfeipqk/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 25201 1726882686.20746: variable 'ansible_facts' from source: unknown 25201 1726882686.20829: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882686.1755102-25546-51453595000327/AnsiballZ_command.py 25201 1726882686.21349: Sending initial data 25201 1726882686.21353: Sent initial data (155 bytes) 25201 1726882686.22561: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882686.22569: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882686.22583: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882686.22618: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882686.22623: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration 
<<< 25201 1726882686.22635: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882686.22641: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882686.22646: stderr chunk (state=3): >>>debug2: match found <<< 25201 1726882686.22658: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882686.22736: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882686.22744: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882686.22755: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882686.22881: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882686.24623: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 25201 1726882686.24722: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 25201 1726882686.24820: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-25201fmfeipqk/tmp_11u5gt4 /root/.ansible/tmp/ansible-tmp-1726882686.1755102-25546-51453595000327/AnsiballZ_command.py <<< 25201 1726882686.24920: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 25201 1726882686.26833: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882686.26972: stderr chunk (state=3): >>><<< 25201 1726882686.26975: stdout chunk (state=3): >>><<< 25201 1726882686.26996: done transferring module to remote 25201 1726882686.27008: _low_level_execute_command(): starting 25201 1726882686.27011: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882686.1755102-25546-51453595000327/ /root/.ansible/tmp/ansible-tmp-1726882686.1755102-25546-51453595000327/AnsiballZ_command.py && sleep 0' 25201 1726882686.28444: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25201 1726882686.29081: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882686.29090: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882686.29104: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882686.29143: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882686.29150: stderr chunk (state=3): >>>debug2: match not found <<< 25201 1726882686.29160: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882686.29178: stderr chunk 
(state=3): >>>debug1: configuration requests final Match pass <<< 25201 1726882686.29186: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 25201 1726882686.29192: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 25201 1726882686.29200: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882686.29209: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882686.29219: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882686.29226: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882686.29232: stderr chunk (state=3): >>>debug2: match found <<< 25201 1726882686.29241: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882686.29320: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882686.29338: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882686.29349: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882686.29480: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882686.31346: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882686.31350: stdout chunk (state=3): >>><<< 25201 1726882686.31357: stderr chunk (state=3): >>><<< 25201 1726882686.31382: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25201 1726882686.31385: _low_level_execute_command(): starting 25201 1726882686.31388: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882686.1755102-25546-51453595000327/AnsiballZ_command.py && sleep 0' 25201 1726882686.33230: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25201 1726882686.33344: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882686.33347: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882686.33483: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 
originally 10.31.11.158 debug2: match not found <<< 25201 1726882686.33489: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration <<< 25201 1726882686.33503: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882686.33508: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<< 25201 1726882686.33521: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882686.33700: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882686.33706: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882686.33719: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882686.33883: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882686.47881: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "add", "veth0", "type", "veth", "peer", "name", "peerveth0"], "start": "2024-09-20 21:38:06.466637", "end": "2024-09-20 21:38:06.476445", "delta": "0:00:00.009808", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link add veth0 type veth peer name peerveth0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 25201 1726882686.50450: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
<<< 25201 1726882686.50455: stdout chunk (state=3): >>><<< 25201 1726882686.50457: stderr chunk (state=3): >>><<< 25201 1726882686.50629: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "add", "veth0", "type", "veth", "peer", "name", "peerveth0"], "start": "2024-09-20 21:38:06.466637", "end": "2024-09-20 21:38:06.476445", "delta": "0:00:00.009808", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link add veth0 type veth peer name peerveth0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
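The exchange above is the full non-pipelined AnsiballZ cycle for a single command invocation: make a remote temp directory, sftp AnsiballZ_command.py into it, chmod u+x, run it with /usr/bin/python3.9, and (just below) remove the directory again. With pipelining enabled the module payload is fed to the remote interpreter over stdin and those extra round trips disappear, subject to the usual requiretty caveat when become is involved. The YAML inventory fragment below is illustrative only, not part of this run, and shows one way the setting could be switched on:

all:
  vars:
    ansible_pipelining: true    # skips the mkdir / sftp / chmod / rm steps logged above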
25201 1726882686.50637: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link add veth0 type veth peer name peerveth0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882686.1755102-25546-51453595000327/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 25201 1726882686.50640: _low_level_execute_command(): starting 25201 1726882686.50642: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882686.1755102-25546-51453595000327/ > /dev/null 2>&1 && sleep 0' 25201 1726882686.52181: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 25201 1726882686.52223: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882686.52229: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882686.52341: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882686.54749: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882686.54818: stderr chunk (state=3): >>><<< 25201 1726882686.54821: stdout chunk (state=3): >>><<< 25201 1726882686.55172: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25201 1726882686.55176: handler run complete 25201 1726882686.55178: Evaluated conditional (False): False 25201 1726882686.55180: attempt loop complete, returning result 25201 1726882686.55182: variable 'item' from source: unknown 25201 1726882686.55184: variable 'item' from source: unknown ok: [managed_node2] => (item=ip link add veth0 type veth peer name peerveth0) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "add", "veth0", "type", "veth", "peer", "name", "peerveth0" ], "delta": "0:00:00.009808", "end": "2024-09-20 21:38:06.476445", "item": "ip link add veth0 type veth peer name peerveth0", "rc": 0, "start": "2024-09-20 21:38:06.466637" } 25201 1726882686.55337: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882686.55340: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882686.55343: variable 'omit' from source: magic vars 25201 1726882686.55469: variable 'ansible_distribution_major_version' from source: facts 25201 1726882686.55482: Evaluated conditional (ansible_distribution_major_version != '6'): True 25201 1726882686.55733: variable 'type' from source: play vars 25201 1726882686.55897: variable 'state' from source: include params 25201 1726882686.55906: variable 'interface' from source: play vars 25201 1726882686.55915: variable 'current_interfaces' from source: set_fact 25201 1726882686.55925: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 25201 1726882686.55933: variable 'omit' from source: magic vars 25201 1726882686.55952: variable 'omit' from source: magic vars 25201 1726882686.56004: variable 'item' from source: unknown 25201 1726882686.56175: variable 'item' from source: unknown 25201 1726882686.56194: variable 'omit' from source: magic vars 25201 1726882686.56237: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25201 1726882686.56334: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25201 1726882686.56346: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25201 1726882686.56381: variable 'inventory_hostname' from source: host vars for 'managed_node2' 25201 1726882686.56390: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882686.56397: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882686.56609: Set connection var ansible_shell_executable to /bin/sh 25201 1726882686.56620: Set connection var ansible_pipelining to False 25201 1726882686.56629: Set connection var ansible_connection to ssh 25201 1726882686.56638: Set connection var ansible_module_compression to ZIP_DEFLATED 25201 1726882686.56650: Set connection var ansible_shell_type to sh 25201 1726882686.56666: Set connection var ansible_timeout to 10 25201 1726882686.56782: variable 'ansible_shell_executable' from source: unknown 25201 1726882686.56790: variable 'ansible_connection' from source: unknown 25201 1726882686.56804: variable 'ansible_module_compression' from 
source: unknown 25201 1726882686.56812: variable 'ansible_shell_type' from source: unknown 25201 1726882686.56819: variable 'ansible_shell_executable' from source: unknown 25201 1726882686.56826: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882686.56834: variable 'ansible_pipelining' from source: unknown 25201 1726882686.56848: variable 'ansible_timeout' from source: unknown 25201 1726882686.56857: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882686.57098: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 25201 1726882686.57112: variable 'omit' from source: magic vars 25201 1726882686.57170: starting attempt loop 25201 1726882686.57178: running the handler 25201 1726882686.57194: _low_level_execute_command(): starting 25201 1726882686.57246: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 25201 1726882686.58993: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882686.58996: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882686.59038: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882686.59042: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882686.59044: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<< 25201 1726882686.59154: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882686.59212: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882686.59376: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882686.59494: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882686.61079: stdout chunk (state=3): >>>/root <<< 25201 1726882686.61253: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882686.61256: stdout chunk (state=3): >>><<< 25201 1726882686.61268: stderr chunk (state=3): >>><<< 25201 1726882686.61283: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25201 1726882686.61289: _low_level_execute_command(): starting 25201 1726882686.61294: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882686.6127977-25546-117104373950081 `" && echo ansible-tmp-1726882686.6127977-25546-117104373950081="` echo /root/.ansible/tmp/ansible-tmp-1726882686.6127977-25546-117104373950081 `" ) && sleep 0' 25201 1726882686.61887: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25201 1726882686.61896: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882686.61905: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882686.61919: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882686.61955: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882686.61966: stderr chunk (state=3): >>>debug2: match not found <<< 25201 1726882686.61975: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882686.61989: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25201 1726882686.61995: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 25201 1726882686.62002: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 25201 1726882686.62009: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882686.62018: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882686.62030: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882686.62041: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882686.62048: stderr chunk (state=3): >>>debug2: match found <<< 25201 1726882686.62058: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882686.62128: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882686.62145: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882686.62157: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882686.62288: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882686.64158: stdout chunk (state=3): >>>ansible-tmp-1726882686.6127977-25546-117104373950081=/root/.ansible/tmp/ansible-tmp-1726882686.6127977-25546-117104373950081 <<< 25201 1726882686.64312: stderr chunk (state=3): 
>>>debug2: Received exit status from master 0 <<< 25201 1726882686.64315: stdout chunk (state=3): >>><<< 25201 1726882686.64324: stderr chunk (state=3): >>><<< 25201 1726882686.64335: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882686.6127977-25546-117104373950081=/root/.ansible/tmp/ansible-tmp-1726882686.6127977-25546-117104373950081 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25201 1726882686.64362: variable 'ansible_module_compression' from source: unknown 25201 1726882686.64402: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-25201fmfeipqk/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 25201 1726882686.64422: variable 'ansible_facts' from source: unknown 25201 1726882686.64487: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882686.6127977-25546-117104373950081/AnsiballZ_command.py 25201 1726882686.64613: Sending initial data 25201 1726882686.64616: Sent initial data (156 bytes) 25201 1726882686.65500: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25201 1726882686.65510: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882686.65520: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882686.65534: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882686.65573: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882686.65580: stderr chunk (state=3): >>>debug2: match not found <<< 25201 1726882686.65590: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882686.65604: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25201 1726882686.65611: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 25201 1726882686.65618: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 25201 1726882686.65626: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882686.65635: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882686.65646: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882686.65653: stderr chunk (state=3): 
>>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882686.65660: stderr chunk (state=3): >>>debug2: match found <<< 25201 1726882686.65671: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882686.65738: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882686.65755: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882686.65769: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882686.65909: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882686.67609: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 25201 1726882686.67706: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 25201 1726882686.67810: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-25201fmfeipqk/tmp3uaqjnda /root/.ansible/tmp/ansible-tmp-1726882686.6127977-25546-117104373950081/AnsiballZ_command.py <<< 25201 1726882686.67900: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 25201 1726882686.69461: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882686.69679: stderr chunk (state=3): >>><<< 25201 1726882686.69682: stdout chunk (state=3): >>><<< 25201 1726882686.69684: done transferring module to remote 25201 1726882686.69687: _low_level_execute_command(): starting 25201 1726882686.69689: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882686.6127977-25546-117104373950081/ /root/.ansible/tmp/ansible-tmp-1726882686.6127977-25546-117104373950081/AnsiballZ_command.py && sleep 0' 25201 1726882686.70256: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25201 1726882686.70274: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882686.70289: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882686.70305: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882686.70350: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882686.70361: stderr chunk (state=3): >>>debug2: match not found <<< 25201 1726882686.70380: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882686.70396: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25201 1726882686.70406: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 25201 1726882686.70416: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 25201 
1726882686.70427: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882686.70445: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882686.70459: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882686.70476: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882686.70486: stderr chunk (state=3): >>>debug2: match found <<< 25201 1726882686.70498: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882686.70583: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882686.70605: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882686.70621: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882686.70747: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882686.72573: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882686.72576: stdout chunk (state=3): >>><<< 25201 1726882686.72579: stderr chunk (state=3): >>><<< 25201 1726882686.72639: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25201 1726882686.72643: _low_level_execute_command(): starting 25201 1726882686.72646: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882686.6127977-25546-117104373950081/AnsiballZ_command.py && sleep 0' 25201 1726882686.73387: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25201 1726882686.73401: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882686.73418: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882686.73434: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882686.73478: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882686.73495: stderr chunk (state=3): >>>debug2: match not found <<< 25201 1726882686.73509: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882686.73527: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25201 1726882686.73538: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 25201 1726882686.73547: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 25201 1726882686.73558: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882686.73574: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882686.73617: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882686.73628: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882686.73638: stderr chunk (state=3): >>>debug2: match found <<< 25201 1726882686.73650: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882686.73742: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882686.73766: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882686.73782: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882686.73916: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882686.87345: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "peerveth0", "up"], "start": "2024-09-20 21:38:06.867558", "end": "2024-09-20 21:38:06.871402", "delta": "0:00:00.003844", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set peerveth0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 25201 1726882686.88594: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
<<< 25201 1726882686.88598: stdout chunk (state=3): >>><<< 25201 1726882686.88601: stderr chunk (state=3): >>><<< 25201 1726882686.88726: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "peerveth0", "up"], "start": "2024-09-20 21:38:06.867558", "end": "2024-09-20 21:38:06.871402", "delta": "0:00:00.003844", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set peerveth0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
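Both ip commands so far return "changed": true in their raw module JSON, yet the per-item task results printed in this log show "changed": false immediately after an "Evaluated conditional (False): False" entry; that is the pattern a changed_when: false on the task would produce, which is an inference here since the task file itself is not shown. The same mechanism appears in the illustrative follow-up check below, which is not part of the recorded play; the task name and the use of "ip -o link show" are assumptions.

- name: Verify the veth pair is present (illustrative; not in the recorded play)
  command: ip -o link show {{ item }}
  with_items:
    - veth0
    - peerveth0
  changed_when: false    # read-only check, using the same mechanism inferred for the real task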
25201 1726882686.88729: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link set peerveth0 up', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882686.6127977-25546-117104373950081/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 25201 1726882686.88732: _low_level_execute_command(): starting 25201 1726882686.88734: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882686.6127977-25546-117104373950081/ > /dev/null 2>&1 && sleep 0' 25201 1726882686.89341: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25201 1726882686.89356: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882686.89374: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882686.89400: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882686.89444: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882686.89456: stderr chunk (state=3): >>>debug2: match not found <<< 25201 1726882686.89474: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882686.89493: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25201 1726882686.89511: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 25201 1726882686.89529: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 25201 1726882686.89540: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882686.89553: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882686.89571: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882686.89584: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882686.89596: stderr chunk (state=3): >>>debug2: match found <<< 25201 1726882686.89610: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882686.89694: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882686.89716: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882686.89740: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882686.89876: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882686.91712: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882686.91791: stderr chunk (state=3): >>><<< 25201 1726882686.91802: stdout chunk (state=3): >>><<< 25201 1726882686.91870: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, 
OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25201 1726882686.91878: handler run complete 25201 1726882686.91881: Evaluated conditional (False): False 25201 1726882686.91883: attempt loop complete, returning result 25201 1726882686.92074: variable 'item' from source: unknown 25201 1726882686.92077: variable 'item' from source: unknown ok: [managed_node2] => (item=ip link set peerveth0 up) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "set", "peerveth0", "up" ], "delta": "0:00:00.003844", "end": "2024-09-20 21:38:06.871402", "item": "ip link set peerveth0 up", "rc": 0, "start": "2024-09-20 21:38:06.867558" } 25201 1726882686.92209: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882686.92241: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882686.92270: variable 'omit' from source: magic vars 25201 1726882686.92547: variable 'ansible_distribution_major_version' from source: facts 25201 1726882686.92565: Evaluated conditional (ansible_distribution_major_version != '6'): True 25201 1726882686.92774: variable 'type' from source: play vars 25201 1726882686.92784: variable 'state' from source: include params 25201 1726882686.92793: variable 'interface' from source: play vars 25201 1726882686.92801: variable 'current_interfaces' from source: set_fact 25201 1726882686.92810: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 25201 1726882686.92818: variable 'omit' from source: magic vars 25201 1726882686.92855: variable 'omit' from source: magic vars 25201 1726882686.92901: variable 'item' from source: unknown 25201 1726882686.92985: variable 'item' from source: unknown 25201 1726882686.93010: variable 'omit' from source: magic vars 25201 1726882686.93041: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25201 1726882686.93063: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25201 1726882686.93078: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25201 1726882686.93103: variable 'inventory_hostname' from source: host vars for 'managed_node2' 25201 1726882686.93111: variable 'ansible_host' from source: host vars for 
'managed_node2' 25201 1726882686.93119: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882686.93229: Set connection var ansible_shell_executable to /bin/sh 25201 1726882686.93241: Set connection var ansible_pipelining to False 25201 1726882686.93279: Set connection var ansible_connection to ssh 25201 1726882686.93299: Set connection var ansible_module_compression to ZIP_DEFLATED 25201 1726882686.93307: Set connection var ansible_shell_type to sh 25201 1726882686.93319: Set connection var ansible_timeout to 10 25201 1726882686.93342: variable 'ansible_shell_executable' from source: unknown 25201 1726882686.93350: variable 'ansible_connection' from source: unknown 25201 1726882686.93357: variable 'ansible_module_compression' from source: unknown 25201 1726882686.93365: variable 'ansible_shell_type' from source: unknown 25201 1726882686.93373: variable 'ansible_shell_executable' from source: unknown 25201 1726882686.93379: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882686.93396: variable 'ansible_pipelining' from source: unknown 25201 1726882686.93406: variable 'ansible_timeout' from source: unknown 25201 1726882686.93415: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882686.93537: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 25201 1726882686.93551: variable 'omit' from source: magic vars 25201 1726882686.93559: starting attempt loop 25201 1726882686.93568: running the handler 25201 1726882686.93579: _low_level_execute_command(): starting 25201 1726882686.93587: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 25201 1726882686.94268: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25201 1726882686.94286: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882686.94302: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882686.94321: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882686.94366: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882686.94388: stderr chunk (state=3): >>>debug2: match not found <<< 25201 1726882686.94404: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882686.94422: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25201 1726882686.94434: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 25201 1726882686.94446: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 25201 1726882686.94458: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882686.94474: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882686.94501: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882686.94514: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882686.94525: stderr chunk 
(state=3): >>>debug2: match found <<< 25201 1726882686.94539: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882686.94626: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882686.94644: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882686.94659: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882686.94794: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882686.96402: stdout chunk (state=3): >>>/root <<< 25201 1726882686.96556: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882686.96560: stderr chunk (state=3): >>><<< 25201 1726882686.96568: stdout chunk (state=3): >>><<< 25201 1726882686.96586: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25201 1726882686.96594: _low_level_execute_command(): starting 25201 1726882686.96601: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882686.9658558-25546-276275603587411 `" && echo ansible-tmp-1726882686.9658558-25546-276275603587411="` echo /root/.ansible/tmp/ansible-tmp-1726882686.9658558-25546-276275603587411 `" ) && sleep 0' 25201 1726882686.98152: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25201 1726882686.98169: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882686.98184: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882686.98199: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882686.98244: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882686.98255: stderr chunk (state=3): >>>debug2: match not found <<< 25201 1726882686.98270: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882686.98291: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25201 1726882686.98303: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 25201 1726882686.98313: stderr 
chunk (state=3): >>>debug1: re-parsing configuration <<< 25201 1726882686.98323: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882686.98337: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882686.98355: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882686.98369: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882686.98380: stderr chunk (state=3): >>>debug2: match found <<< 25201 1726882686.98392: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882686.98474: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882686.98490: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882686.98503: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882686.98632: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882687.00518: stdout chunk (state=3): >>>ansible-tmp-1726882686.9658558-25546-276275603587411=/root/.ansible/tmp/ansible-tmp-1726882686.9658558-25546-276275603587411 <<< 25201 1726882687.00630: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882687.00698: stderr chunk (state=3): >>><<< 25201 1726882687.00702: stdout chunk (state=3): >>><<< 25201 1726882687.00776: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882686.9658558-25546-276275603587411=/root/.ansible/tmp/ansible-tmp-1726882686.9658558-25546-276275603587411 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25201 1726882687.00779: variable 'ansible_module_compression' from source: unknown 25201 1726882687.00970: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-25201fmfeipqk/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 25201 1726882687.00972: variable 'ansible_facts' from source: unknown 25201 1726882687.00975: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882686.9658558-25546-276275603587411/AnsiballZ_command.py 25201 1726882687.01104: Sending initial data 25201 1726882687.01107: Sent initial data (156 bytes) 25201 1726882687.02071: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config <<< 25201 1726882687.02082: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882687.02120: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 25201 1726882687.02123: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882687.02126: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882687.02196: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882687.02200: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882687.02313: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882687.04091: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 25201 1726882687.04188: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 25201 1726882687.04294: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-25201fmfeipqk/tmp26b57z8x /root/.ansible/tmp/ansible-tmp-1726882686.9658558-25546-276275603587411/AnsiballZ_command.py <<< 25201 1726882687.04387: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 25201 1726882687.05892: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882687.05998: stderr chunk (state=3): >>><<< 25201 1726882687.06001: stdout chunk (state=3): >>><<< 25201 1726882687.06017: done transferring module to remote 25201 1726882687.06027: _low_level_execute_command(): starting 25201 1726882687.06036: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882686.9658558-25546-276275603587411/ /root/.ansible/tmp/ansible-tmp-1726882686.9658558-25546-276275603587411/AnsiballZ_command.py && sleep 0' 25201 1726882687.06454: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882687.06462: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882687.06511: stderr chunk (state=3): >>>debug2: 
checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 25201 1726882687.06514: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882687.06516: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882687.06518: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882687.06573: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882687.06577: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882687.06685: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882687.08635: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882687.09345: stderr chunk (state=3): >>><<< 25201 1726882687.09349: stdout chunk (state=3): >>><<< 25201 1726882687.09352: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25201 1726882687.09354: _low_level_execute_command(): starting 25201 1726882687.09357: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882686.9658558-25546-276275603587411/AnsiballZ_command.py && sleep 0' 25201 1726882687.09359: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25201 1726882687.09361: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882687.09368: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882687.09388: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882687.09634: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882687.09638: stderr chunk (state=3): >>>debug2: match not found <<< 25201 
1726882687.09640: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882687.09642: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25201 1726882687.09643: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 25201 1726882687.09645: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 25201 1726882687.09647: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882687.09649: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882687.09653: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882687.09655: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882687.09656: stderr chunk (state=3): >>>debug2: match found <<< 25201 1726882687.09658: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882687.09660: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882687.09663: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882687.09665: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882687.09743: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882687.23499: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "veth0", "up"], "start": "2024-09-20 21:38:07.225885", "end": "2024-09-20 21:38:07.231335", "delta": "0:00:00.005450", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set veth0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 25201 1726882687.24573: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
<<< 25201 1726882687.24623: stderr chunk (state=3): >>><<< 25201 1726882687.24627: stdout chunk (state=3): >>><<< 25201 1726882687.24640: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "veth0", "up"], "start": "2024-09-20 21:38:07.225885", "end": "2024-09-20 21:38:07.231335", "delta": "0:00:00.005450", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set veth0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
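[editor's note] The two completed loop items above ("Create veth interface veth0") bring both ends of the veth pair up with ip link set <device> up. A minimal stand-alone sketch of the same operation, assuming only the two device names visible in this log (peerveth0, veth0) and root privileges on the managed host; the function name is hypothetical and this is an illustration, not the playbook's own task code.

import subprocess

def bring_link_up(device: str) -> None:
    # Mirrors the command-module call recorded above: ip link set <device> up
    subprocess.run(["ip", "link", "set", device, "up"], check=True)

if __name__ == "__main__":
    for dev in ("peerveth0", "veth0"):
        bring_link_up(dev)
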
25201 1726882687.24668: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link set veth0 up', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882686.9658558-25546-276275603587411/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 25201 1726882687.24676: _low_level_execute_command(): starting 25201 1726882687.24679: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882686.9658558-25546-276275603587411/ > /dev/null 2>&1 && sleep 0' 25201 1726882687.25100: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882687.25103: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882687.25139: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882687.25142: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882687.25145: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882687.25198: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882687.25201: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882687.25302: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882687.27107: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882687.27150: stderr chunk (state=3): >>><<< 25201 1726882687.27154: stdout chunk (state=3): >>><<< 25201 1726882687.27170: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25201 1726882687.27173: handler run complete 25201 1726882687.27188: Evaluated conditional (False): False 25201 1726882687.27195: attempt loop complete, returning result 25201 1726882687.27209: variable 'item' from source: unknown 25201 1726882687.27271: variable 'item' from source: unknown ok: [managed_node2] => (item=ip link set veth0 up) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "set", "veth0", "up" ], "delta": "0:00:00.005450", "end": "2024-09-20 21:38:07.231335", "item": "ip link set veth0 up", "rc": 0, "start": "2024-09-20 21:38:07.225885" } 25201 1726882687.27388: dumping result to json 25201 1726882687.27391: done dumping result, returning 25201 1726882687.27393: done running TaskExecutor() for managed_node2/TASK: Create veth interface veth0 [0e448fcc-3ce9-313b-197e-00000000015a] 25201 1726882687.27396: sending task result for task 0e448fcc-3ce9-313b-197e-00000000015a 25201 1726882687.27440: done sending task result for task 0e448fcc-3ce9-313b-197e-00000000015a 25201 1726882687.27442: WORKER PROCESS EXITING 25201 1726882687.27503: no more pending results, returning what we have 25201 1726882687.27506: results queue empty 25201 1726882687.27507: checking for any_errors_fatal 25201 1726882687.27512: done checking for any_errors_fatal 25201 1726882687.27513: checking for max_fail_percentage 25201 1726882687.27514: done checking for max_fail_percentage 25201 1726882687.27515: checking to see if all hosts have failed and the running result is not ok 25201 1726882687.27516: done checking to see if all hosts have failed 25201 1726882687.27516: getting the remaining hosts for this loop 25201 1726882687.27519: done getting the remaining hosts for this loop 25201 1726882687.27522: getting the next task for host managed_node2 25201 1726882687.27527: done getting next task for host managed_node2 25201 1726882687.27529: ^ task is: TASK: Set up veth as managed by NetworkManager 25201 1726882687.27532: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25201 1726882687.27534: getting variables 25201 1726882687.27536: in VariableManager get_vars() 25201 1726882687.27581: Calling all_inventory to load vars for managed_node2 25201 1726882687.27584: Calling groups_inventory to load vars for managed_node2 25201 1726882687.27586: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882687.27595: Calling all_plugins_play to load vars for managed_node2 25201 1726882687.27598: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882687.27600: Calling groups_plugins_play to load vars for managed_node2 25201 1726882687.27744: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882687.27865: done with get_vars() 25201 1726882687.27873: done getting variables 25201 1726882687.27916: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set up veth as managed by NetworkManager] ******************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:35 Friday 20 September 2024 21:38:07 -0400 (0:00:01.164) 0:00:08.454 ****** 25201 1726882687.27935: entering _queue_task() for managed_node2/command 25201 1726882687.28112: worker is 1 (out of 1 available) 25201 1726882687.28122: exiting _queue_task() for managed_node2/command 25201 1726882687.28132: done queuing things up, now waiting for results queue to drain 25201 1726882687.28134: waiting for pending results... 
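[editor's note] The task just queued will go through the same remote-execution sequence the log records for every command: make a temp directory under ~/.ansible/tmp, transfer the AnsiballZ_command.py payload, chmod it, run it with the remote interpreter, then remove the temp directory. A rough sketch of that sequence follows, assuming plain ssh/scp subprocess calls instead of Ansible's connection plugin (the log itself uses a multiplexed SSH master and SFTP); the host address and interpreter path are taken from the log, everything else (function names, scp) is illustrative.

import subprocess

HOST = "10.31.11.158"  # managed host address seen throughout this log

def ssh(command: str) -> str:
    # Roughly what _low_level_execute_command() does: run one shell command
    # on the managed host over SSH and return its stdout.
    result = subprocess.run(["ssh", HOST, command],
                            capture_output=True, text=True, check=True)
    return result.stdout

def run_command_module(local_payload: str, remote_tmp: str) -> str:
    # 1. create the remote temp dir (the log does this with umask 77 && mkdir)
    ssh(f"umask 77 && mkdir -p {remote_tmp}")
    # 2. copy the AnsiballZ payload; the log uses SFTP, scp is used here for brevity
    subprocess.run(["scp", local_payload, f"{HOST}:{remote_tmp}/AnsiballZ_command.py"],
                   check=True)
    # 3. chmod, execute with the remote interpreter, then clean up the temp dir
    ssh(f"chmod u+x {remote_tmp} {remote_tmp}/AnsiballZ_command.py")
    output = ssh(f"/usr/bin/python3.9 {remote_tmp}/AnsiballZ_command.py")
    ssh(f"rm -rf {remote_tmp}")
    return output
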
25201 1726882687.28290: running TaskExecutor() for managed_node2/TASK: Set up veth as managed by NetworkManager 25201 1726882687.28351: in run() - task 0e448fcc-3ce9-313b-197e-00000000015b 25201 1726882687.28365: variable 'ansible_search_path' from source: unknown 25201 1726882687.28370: variable 'ansible_search_path' from source: unknown 25201 1726882687.28394: calling self._execute() 25201 1726882687.28456: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882687.28464: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882687.28474: variable 'omit' from source: magic vars 25201 1726882687.28722: variable 'ansible_distribution_major_version' from source: facts 25201 1726882687.28731: Evaluated conditional (ansible_distribution_major_version != '6'): True 25201 1726882687.28834: variable 'type' from source: play vars 25201 1726882687.28837: variable 'state' from source: include params 25201 1726882687.28842: Evaluated conditional (type == 'veth' and state == 'present'): True 25201 1726882687.28848: variable 'omit' from source: magic vars 25201 1726882687.28877: variable 'omit' from source: magic vars 25201 1726882687.28943: variable 'interface' from source: play vars 25201 1726882687.28956: variable 'omit' from source: magic vars 25201 1726882687.28993: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25201 1726882687.29016: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25201 1726882687.29031: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25201 1726882687.29043: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25201 1726882687.29053: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25201 1726882687.29079: variable 'inventory_hostname' from source: host vars for 'managed_node2' 25201 1726882687.29082: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882687.29084: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882687.29151: Set connection var ansible_shell_executable to /bin/sh 25201 1726882687.29155: Set connection var ansible_pipelining to False 25201 1726882687.29160: Set connection var ansible_connection to ssh 25201 1726882687.29167: Set connection var ansible_module_compression to ZIP_DEFLATED 25201 1726882687.29174: Set connection var ansible_shell_type to sh 25201 1726882687.29181: Set connection var ansible_timeout to 10 25201 1726882687.29197: variable 'ansible_shell_executable' from source: unknown 25201 1726882687.29200: variable 'ansible_connection' from source: unknown 25201 1726882687.29208: variable 'ansible_module_compression' from source: unknown 25201 1726882687.29211: variable 'ansible_shell_type' from source: unknown 25201 1726882687.29215: variable 'ansible_shell_executable' from source: unknown 25201 1726882687.29218: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882687.29220: variable 'ansible_pipelining' from source: unknown 25201 1726882687.29222: variable 'ansible_timeout' from source: unknown 25201 1726882687.29226: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882687.29320: Loading ActionModule 'command' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 25201 1726882687.29323: variable 'omit' from source: magic vars 25201 1726882687.29326: starting attempt loop 25201 1726882687.29329: running the handler 25201 1726882687.29341: _low_level_execute_command(): starting 25201 1726882687.29349: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 25201 1726882687.29829: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882687.29845: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882687.29859: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882687.29881: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882687.29922: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882687.29934: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882687.30042: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882687.31622: stdout chunk (state=3): >>>/root <<< 25201 1726882687.31724: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882687.31769: stderr chunk (state=3): >>><<< 25201 1726882687.31772: stdout chunk (state=3): >>><<< 25201 1726882687.31791: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 
25201 1726882687.31806: _low_level_execute_command(): starting 25201 1726882687.31811: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882687.3179111-25610-679002244693 `" && echo ansible-tmp-1726882687.3179111-25610-679002244693="` echo /root/.ansible/tmp/ansible-tmp-1726882687.3179111-25610-679002244693 `" ) && sleep 0' 25201 1726882687.32237: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882687.32250: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882687.32274: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<< 25201 1726882687.32286: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882687.32323: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882687.32347: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882687.32442: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882687.34316: stdout chunk (state=3): >>>ansible-tmp-1726882687.3179111-25610-679002244693=/root/.ansible/tmp/ansible-tmp-1726882687.3179111-25610-679002244693 <<< 25201 1726882687.34466: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882687.34476: stdout chunk (state=3): >>><<< 25201 1726882687.34488: stderr chunk (state=3): >>><<< 25201 1726882687.34515: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882687.3179111-25610-679002244693=/root/.ansible/tmp/ansible-tmp-1726882687.3179111-25610-679002244693 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25201 1726882687.34548: variable 'ansible_module_compression' from source: unknown 25201 1726882687.34603: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-25201fmfeipqk/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 25201 1726882687.34649: variable 'ansible_facts' from source: unknown 25201 1726882687.34743: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882687.3179111-25610-679002244693/AnsiballZ_command.py 25201 1726882687.34889: Sending initial data 25201 1726882687.34891: Sent initial data (153 bytes) 25201 1726882687.35746: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882687.35749: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882687.35786: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882687.35789: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882687.35792: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882687.35843: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882687.35846: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882687.35948: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882687.37687: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 25201 1726882687.37784: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 25201 1726882687.37896: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-25201fmfeipqk/tmpveqig0af /root/.ansible/tmp/ansible-tmp-1726882687.3179111-25610-679002244693/AnsiballZ_command.py <<< 25201 1726882687.37995: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 25201 1726882687.39297: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882687.39469: stderr chunk (state=3): >>><<< 25201 1726882687.39472: stdout chunk (state=3): >>><<< 25201 1726882687.39561: done 
transferring module to remote 25201 1726882687.39565: _low_level_execute_command(): starting 25201 1726882687.39568: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882687.3179111-25610-679002244693/ /root/.ansible/tmp/ansible-tmp-1726882687.3179111-25610-679002244693/AnsiballZ_command.py && sleep 0' 25201 1726882687.40145: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25201 1726882687.40157: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882687.40173: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882687.40191: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882687.40266: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882687.40286: stderr chunk (state=3): >>>debug2: match not found <<< 25201 1726882687.40300: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882687.40319: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25201 1726882687.40335: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 25201 1726882687.40354: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 25201 1726882687.40382: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882687.40396: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882687.40410: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882687.40424: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882687.40440: stderr chunk (state=3): >>>debug2: match found <<< 25201 1726882687.40453: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882687.40530: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882687.40560: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882687.40587: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882687.40715: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882687.42569: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882687.42573: stdout chunk (state=3): >>><<< 25201 1726882687.42575: stderr chunk (state=3): >>><<< 25201 1726882687.42665: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25201 1726882687.42670: _low_level_execute_command(): starting 25201 1726882687.42674: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882687.3179111-25610-679002244693/AnsiballZ_command.py && sleep 0' 25201 1726882687.43313: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25201 1726882687.43460: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882687.44277: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882687.44297: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882687.44340: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882687.44351: stderr chunk (state=3): >>>debug2: match not found <<< 25201 1726882687.44366: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882687.44384: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25201 1726882687.44396: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 25201 1726882687.44406: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 25201 1726882687.44422: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882687.44434: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882687.44448: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882687.44458: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882687.44471: stderr chunk (state=3): >>>debug2: match found <<< 25201 1726882687.44484: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882687.44562: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882687.44588: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882687.44616: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882687.44762: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882687.59845: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "d", "set", "veth0", "managed", "true"], "start": "2024-09-20 21:38:07.575614", "end": "2024-09-20 21:38:07.596176", "delta": "0:00:00.020562", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli d set veth0 managed true", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 25201 1726882687.61123: stderr chunk (state=3): >>>debug2: Received exit status from 
master 0 Shared connection to 10.31.11.158 closed. <<< 25201 1726882687.61127: stdout chunk (state=3): >>><<< 25201 1726882687.61129: stderr chunk (state=3): >>><<< 25201 1726882687.61279: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "d", "set", "veth0", "managed", "true"], "start": "2024-09-20 21:38:07.575614", "end": "2024-09-20 21:38:07.596176", "delta": "0:00:00.020562", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli d set veth0 managed true", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
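[editor's note] The module result above hands the device over to NetworkManager with nmcli d set veth0 managed true. A minimal sketch of the same call, again as an illustration rather than the playbook's task code; the device name comes from the log and the helper name is hypothetical.

import subprocess

def set_nm_managed(device: str, managed: bool = True) -> None:
    # Equivalent of the recorded command: nmcli d set <device> managed true
    subprocess.run(
        ["nmcli", "d", "set", device, "managed", "true" if managed else "false"],
        check=True,
    )

set_nm_managed("veth0")
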
25201 1726882687.61283: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli d set veth0 managed true', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882687.3179111-25610-679002244693/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 25201 1726882687.61286: _low_level_execute_command(): starting 25201 1726882687.61292: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882687.3179111-25610-679002244693/ > /dev/null 2>&1 && sleep 0' 25201 1726882687.62737: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882687.62741: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882687.62786: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 25201 1726882687.62789: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882687.62791: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882687.62793: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882687.62846: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882687.62857: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882687.63481: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882687.64777: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882687.64844: stderr chunk (state=3): >>><<< 25201 1726882687.64847: stdout chunk (state=3): >>><<< 25201 1726882687.65072: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25201 1726882687.65076: handler run complete 25201 1726882687.65078: Evaluated conditional (False): False 25201 1726882687.65080: attempt loop complete, returning result 25201 1726882687.65082: _execute() done 25201 1726882687.65084: dumping result to json 25201 1726882687.65086: done dumping result, returning 25201 1726882687.65088: done running TaskExecutor() for managed_node2/TASK: Set up veth as managed by NetworkManager [0e448fcc-3ce9-313b-197e-00000000015b] 25201 1726882687.65090: sending task result for task 0e448fcc-3ce9-313b-197e-00000000015b 25201 1726882687.65160: done sending task result for task 0e448fcc-3ce9-313b-197e-00000000015b 25201 1726882687.65167: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": [ "nmcli", "d", "set", "veth0", "managed", "true" ], "delta": "0:00:00.020562", "end": "2024-09-20 21:38:07.596176", "rc": 0, "start": "2024-09-20 21:38:07.575614" } 25201 1726882687.65243: no more pending results, returning what we have 25201 1726882687.65246: results queue empty 25201 1726882687.65247: checking for any_errors_fatal 25201 1726882687.65261: done checking for any_errors_fatal 25201 1726882687.65265: checking for max_fail_percentage 25201 1726882687.65267: done checking for max_fail_percentage 25201 1726882687.65268: checking to see if all hosts have failed and the running result is not ok 25201 1726882687.65269: done checking to see if all hosts have failed 25201 1726882687.65270: getting the remaining hosts for this loop 25201 1726882687.65272: done getting the remaining hosts for this loop 25201 1726882687.65276: getting the next task for host managed_node2 25201 1726882687.65283: done getting next task for host managed_node2 25201 1726882687.65285: ^ task is: TASK: Delete veth interface {{ interface }} 25201 1726882687.65288: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25201 1726882687.65292: getting variables 25201 1726882687.65294: in VariableManager get_vars() 25201 1726882687.65334: Calling all_inventory to load vars for managed_node2 25201 1726882687.65338: Calling groups_inventory to load vars for managed_node2 25201 1726882687.65340: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882687.65352: Calling all_plugins_play to load vars for managed_node2 25201 1726882687.65355: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882687.65357: Calling groups_plugins_play to load vars for managed_node2 25201 1726882687.65533: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882687.65911: done with get_vars() 25201 1726882687.65924: done getting variables 25201 1726882687.65990: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 25201 1726882687.66413: variable 'interface' from source: play vars TASK [Delete veth interface veth0] ********************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:43 Friday 20 September 2024 21:38:07 -0400 (0:00:00.385) 0:00:08.839 ****** 25201 1726882687.66443: entering _queue_task() for managed_node2/command 25201 1726882687.66906: worker is 1 (out of 1 available) 25201 1726882687.66918: exiting _queue_task() for managed_node2/command 25201 1726882687.66930: done queuing things up, now waiting for results queue to drain 25201 1726882687.66932: waiting for pending results... 
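The result above shows the "Set up veth as managed by NetworkManager" task completing with cmd ["nmcli", "d", "set", "veth0", "managed", "true"] and rc=0. A minimal sketch of what such a task in manage_test_interface.yml could look like follows; the nmcli command is copied from the recorded invocation, while the command module layout, templating veth0 as {{ interface }}, and the changed_when override (suggested by the module returning "changed": true while the task reports "changed": false) are assumptions, not the actual task source.

  - name: Set up veth as managed by NetworkManager
    # Command copied from the module invocation recorded above;
    # templating veth0 as {{ interface }} is an assumption.
    command: nmcli d set {{ interface }} managed true
    # Assumption: the displayed result reports "changed": false even though the
    # module returned "changed": true, which a changed_when override would explain.
    changed_when: false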
25201 1726882687.67577: running TaskExecutor() for managed_node2/TASK: Delete veth interface veth0 25201 1726882687.67917: in run() - task 0e448fcc-3ce9-313b-197e-00000000015c 25201 1726882687.67938: variable 'ansible_search_path' from source: unknown 25201 1726882687.67949: variable 'ansible_search_path' from source: unknown 25201 1726882687.67993: calling self._execute() 25201 1726882687.68089: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882687.68251: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882687.68270: variable 'omit' from source: magic vars 25201 1726882687.68948: variable 'ansible_distribution_major_version' from source: facts 25201 1726882687.68968: Evaluated conditional (ansible_distribution_major_version != '6'): True 25201 1726882687.69591: variable 'type' from source: play vars 25201 1726882687.69601: variable 'state' from source: include params 25201 1726882687.69609: variable 'interface' from source: play vars 25201 1726882687.69617: variable 'current_interfaces' from source: set_fact 25201 1726882687.69628: Evaluated conditional (type == 'veth' and state == 'absent' and interface in current_interfaces): False 25201 1726882687.69634: when evaluation is False, skipping this task 25201 1726882687.69641: _execute() done 25201 1726882687.69647: dumping result to json 25201 1726882687.69656: done dumping result, returning 25201 1726882687.69671: done running TaskExecutor() for managed_node2/TASK: Delete veth interface veth0 [0e448fcc-3ce9-313b-197e-00000000015c] 25201 1726882687.69682: sending task result for task 0e448fcc-3ce9-313b-197e-00000000015c skipping: [managed_node2] => { "changed": false, "false_condition": "type == 'veth' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 25201 1726882687.69912: no more pending results, returning what we have 25201 1726882687.69916: results queue empty 25201 1726882687.69917: checking for any_errors_fatal 25201 1726882687.69925: done checking for any_errors_fatal 25201 1726882687.69926: checking for max_fail_percentage 25201 1726882687.69928: done checking for max_fail_percentage 25201 1726882687.69928: checking to see if all hosts have failed and the running result is not ok 25201 1726882687.69929: done checking to see if all hosts have failed 25201 1726882687.69930: getting the remaining hosts for this loop 25201 1726882687.69931: done getting the remaining hosts for this loop 25201 1726882687.69935: getting the next task for host managed_node2 25201 1726882687.69941: done getting next task for host managed_node2 25201 1726882687.69944: ^ task is: TASK: Create dummy interface {{ interface }} 25201 1726882687.69947: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25201 1726882687.69950: getting variables 25201 1726882687.69952: in VariableManager get_vars() 25201 1726882687.69996: Calling all_inventory to load vars for managed_node2 25201 1726882687.69999: Calling groups_inventory to load vars for managed_node2 25201 1726882687.70002: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882687.70014: Calling all_plugins_play to load vars for managed_node2 25201 1726882687.70017: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882687.70019: Calling groups_plugins_play to load vars for managed_node2 25201 1726882687.70211: done sending task result for task 0e448fcc-3ce9-313b-197e-00000000015c 25201 1726882687.70214: WORKER PROCESS EXITING 25201 1726882687.70234: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882687.70443: done with get_vars() 25201 1726882687.70452: done getting variables 25201 1726882687.70508: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 25201 1726882687.70614: variable 'interface' from source: play vars TASK [Create dummy interface veth0] ******************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:49 Friday 20 September 2024 21:38:07 -0400 (0:00:00.041) 0:00:08.881 ****** 25201 1726882687.70641: entering _queue_task() for managed_node2/command 25201 1726882687.71639: worker is 1 (out of 1 available) 25201 1726882687.71653: exiting _queue_task() for managed_node2/command 25201 1726882687.71667: done queuing things up, now waiting for results queue to drain 25201 1726882687.71669: waiting for pending results... 
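The "Delete veth interface veth0" task above was skipped because the conditional type == 'veth' and state == 'absent' and interface in current_interfaces evaluated to False. A hedged sketch of such a guarded delete task is below; the when expression is copied from the recorded false_condition, while the ip link command is only an assumed placeholder, since the skipped task's actual command never appears in this log.

  - name: Delete veth interface {{ interface }}
    # Assumed command; a veth pair is commonly removed with "ip link del".
    command: ip link del {{ interface }}
    when:
      - type == 'veth'
      - state == 'absent'
      - interface in current_interfaces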
25201 1726882687.72512: running TaskExecutor() for managed_node2/TASK: Create dummy interface veth0 25201 1726882687.72592: in run() - task 0e448fcc-3ce9-313b-197e-00000000015d 25201 1726882687.72719: variable 'ansible_search_path' from source: unknown 25201 1726882687.72724: variable 'ansible_search_path' from source: unknown 25201 1726882687.72753: calling self._execute() 25201 1726882687.72946: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882687.72951: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882687.72961: variable 'omit' from source: magic vars 25201 1726882687.74731: variable 'ansible_distribution_major_version' from source: facts 25201 1726882687.74747: Evaluated conditional (ansible_distribution_major_version != '6'): True 25201 1726882687.74970: variable 'type' from source: play vars 25201 1726882687.74981: variable 'state' from source: include params 25201 1726882687.74990: variable 'interface' from source: play vars 25201 1726882687.75000: variable 'current_interfaces' from source: set_fact 25201 1726882687.75012: Evaluated conditional (type == 'dummy' and state == 'present' and interface not in current_interfaces): False 25201 1726882687.75024: when evaluation is False, skipping this task 25201 1726882687.75031: _execute() done 25201 1726882687.75037: dumping result to json 25201 1726882687.75043: done dumping result, returning 25201 1726882687.75051: done running TaskExecutor() for managed_node2/TASK: Create dummy interface veth0 [0e448fcc-3ce9-313b-197e-00000000015d] 25201 1726882687.75060: sending task result for task 0e448fcc-3ce9-313b-197e-00000000015d skipping: [managed_node2] => { "changed": false, "false_condition": "type == 'dummy' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was False" } 25201 1726882687.75205: no more pending results, returning what we have 25201 1726882687.75209: results queue empty 25201 1726882687.75210: checking for any_errors_fatal 25201 1726882687.75217: done checking for any_errors_fatal 25201 1726882687.75218: checking for max_fail_percentage 25201 1726882687.75220: done checking for max_fail_percentage 25201 1726882687.75221: checking to see if all hosts have failed and the running result is not ok 25201 1726882687.75221: done checking to see if all hosts have failed 25201 1726882687.75222: getting the remaining hosts for this loop 25201 1726882687.75224: done getting the remaining hosts for this loop 25201 1726882687.75228: getting the next task for host managed_node2 25201 1726882687.75234: done getting next task for host managed_node2 25201 1726882687.75236: ^ task is: TASK: Delete dummy interface {{ interface }} 25201 1726882687.75240: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25201 1726882687.75244: getting variables 25201 1726882687.75245: in VariableManager get_vars() 25201 1726882687.75290: Calling all_inventory to load vars for managed_node2 25201 1726882687.75293: Calling groups_inventory to load vars for managed_node2 25201 1726882687.75295: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882687.75308: Calling all_plugins_play to load vars for managed_node2 25201 1726882687.75311: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882687.75314: Calling groups_plugins_play to load vars for managed_node2 25201 1726882687.75515: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882687.75747: done with get_vars() 25201 1726882687.75759: done getting variables 25201 1726882687.75837: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 25201 1726882687.76056: variable 'interface' from source: play vars 25201 1726882687.76203: done sending task result for task 0e448fcc-3ce9-313b-197e-00000000015d 25201 1726882687.76207: WORKER PROCESS EXITING TASK [Delete dummy interface veth0] ******************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:54 Friday 20 September 2024 21:38:07 -0400 (0:00:00.055) 0:00:08.936 ****** 25201 1726882687.76218: entering _queue_task() for managed_node2/command 25201 1726882687.76515: worker is 1 (out of 1 available) 25201 1726882687.76528: exiting _queue_task() for managed_node2/command 25201 1726882687.76541: done queuing things up, now waiting for results queue to drain 25201 1726882687.76542: waiting for pending results... 
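The same guard pattern skips "Create dummy interface veth0": type == 'dummy' and state == 'present' and interface not in current_interfaces evaluates to False for this run. A hedged sketch, with the when expression taken from the log and the creation command assumed:

  - name: Create dummy interface {{ interface }}
    # Assumed command; dummy interfaces are typically created with iproute2.
    command: ip link add {{ interface }} type dummy
    when:
      - type == 'dummy'
      - state == 'present'
      - interface not in current_interfaces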
25201 1726882687.78097: running TaskExecutor() for managed_node2/TASK: Delete dummy interface veth0 25201 1726882687.78195: in run() - task 0e448fcc-3ce9-313b-197e-00000000015e 25201 1726882687.78215: variable 'ansible_search_path' from source: unknown 25201 1726882687.78223: variable 'ansible_search_path' from source: unknown 25201 1726882687.78262: calling self._execute() 25201 1726882687.78343: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882687.78352: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882687.78365: variable 'omit' from source: magic vars 25201 1726882687.78712: variable 'ansible_distribution_major_version' from source: facts 25201 1726882687.78734: Evaluated conditional (ansible_distribution_major_version != '6'): True 25201 1726882687.79173: variable 'type' from source: play vars 25201 1726882687.79229: variable 'state' from source: include params 25201 1726882687.79278: variable 'interface' from source: play vars 25201 1726882687.79290: variable 'current_interfaces' from source: set_fact 25201 1726882687.79302: Evaluated conditional (type == 'dummy' and state == 'absent' and interface in current_interfaces): False 25201 1726882687.79309: when evaluation is False, skipping this task 25201 1726882687.79317: _execute() done 25201 1726882687.79323: dumping result to json 25201 1726882687.79477: done dumping result, returning 25201 1726882687.79488: done running TaskExecutor() for managed_node2/TASK: Delete dummy interface veth0 [0e448fcc-3ce9-313b-197e-00000000015e] 25201 1726882687.79500: sending task result for task 0e448fcc-3ce9-313b-197e-00000000015e 25201 1726882687.79601: done sending task result for task 0e448fcc-3ce9-313b-197e-00000000015e 25201 1726882687.79608: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "type == 'dummy' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 25201 1726882687.79658: no more pending results, returning what we have 25201 1726882687.79661: results queue empty 25201 1726882687.79666: checking for any_errors_fatal 25201 1726882687.79672: done checking for any_errors_fatal 25201 1726882687.79673: checking for max_fail_percentage 25201 1726882687.79676: done checking for max_fail_percentage 25201 1726882687.79677: checking to see if all hosts have failed and the running result is not ok 25201 1726882687.79677: done checking to see if all hosts have failed 25201 1726882687.79678: getting the remaining hosts for this loop 25201 1726882687.79679: done getting the remaining hosts for this loop 25201 1726882687.79683: getting the next task for host managed_node2 25201 1726882687.79689: done getting next task for host managed_node2 25201 1726882687.79691: ^ task is: TASK: Create tap interface {{ interface }} 25201 1726882687.79694: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25201 1726882687.79698: getting variables 25201 1726882687.79700: in VariableManager get_vars() 25201 1726882687.79738: Calling all_inventory to load vars for managed_node2 25201 1726882687.79741: Calling groups_inventory to load vars for managed_node2 25201 1726882687.79744: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882687.79755: Calling all_plugins_play to load vars for managed_node2 25201 1726882687.79758: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882687.79760: Calling groups_plugins_play to load vars for managed_node2 25201 1726882687.79982: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882687.80199: done with get_vars() 25201 1726882687.80209: done getting variables 25201 1726882687.80270: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 25201 1726882687.80489: variable 'interface' from source: play vars TASK [Create tap interface veth0] ********************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:60 Friday 20 September 2024 21:38:07 -0400 (0:00:00.044) 0:00:08.980 ****** 25201 1726882687.80631: entering _queue_task() for managed_node2/command 25201 1726882687.81085: worker is 1 (out of 1 available) 25201 1726882687.81097: exiting _queue_task() for managed_node2/command 25201 1726882687.81108: done queuing things up, now waiting for results queue to drain 25201 1726882687.81109: waiting for pending results... 
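Each of these guards also checks membership in current_interfaces, which the log attributes to set_fact ("variable 'current_interfaces' from source: set_fact"). The tasks that populate it are not part of this excerpt; the sketch below is purely illustrative of one way such a fact could be gathered, and every name in it (including the register variable) is assumed.

  - name: List current interface names (illustrative only)
    command: ls /sys/class/net
    register: _iface_listing   # assumed helper variable, not from the log
    changed_when: false

  - name: Record them as a fact (illustrative only)
    set_fact:
      current_interfaces: "{{ _iface_listing.stdout_lines }}"

In this excerpt the fact already exists, so only the membership checks against it are visible.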
25201 1726882687.81936: running TaskExecutor() for managed_node2/TASK: Create tap interface veth0 25201 1726882687.82159: in run() - task 0e448fcc-3ce9-313b-197e-00000000015f 25201 1726882687.82180: variable 'ansible_search_path' from source: unknown 25201 1726882687.82277: variable 'ansible_search_path' from source: unknown 25201 1726882687.82316: calling self._execute() 25201 1726882687.82399: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882687.82477: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882687.82490: variable 'omit' from source: magic vars 25201 1726882687.82811: variable 'ansible_distribution_major_version' from source: facts 25201 1726882687.83582: Evaluated conditional (ansible_distribution_major_version != '6'): True 25201 1726882687.83783: variable 'type' from source: play vars 25201 1726882687.83792: variable 'state' from source: include params 25201 1726882687.83800: variable 'interface' from source: play vars 25201 1726882687.83807: variable 'current_interfaces' from source: set_fact 25201 1726882687.83819: Evaluated conditional (type == 'tap' and state == 'present' and interface not in current_interfaces): False 25201 1726882687.83826: when evaluation is False, skipping this task 25201 1726882687.83832: _execute() done 25201 1726882687.83839: dumping result to json 25201 1726882687.83846: done dumping result, returning 25201 1726882687.83855: done running TaskExecutor() for managed_node2/TASK: Create tap interface veth0 [0e448fcc-3ce9-313b-197e-00000000015f] 25201 1726882687.83869: sending task result for task 0e448fcc-3ce9-313b-197e-00000000015f 25201 1726882687.83960: done sending task result for task 0e448fcc-3ce9-313b-197e-00000000015f skipping: [managed_node2] => { "changed": false, "false_condition": "type == 'tap' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was False" } 25201 1726882687.84011: no more pending results, returning what we have 25201 1726882687.84014: results queue empty 25201 1726882687.84015: checking for any_errors_fatal 25201 1726882687.84021: done checking for any_errors_fatal 25201 1726882687.84022: checking for max_fail_percentage 25201 1726882687.84023: done checking for max_fail_percentage 25201 1726882687.84024: checking to see if all hosts have failed and the running result is not ok 25201 1726882687.84025: done checking to see if all hosts have failed 25201 1726882687.84025: getting the remaining hosts for this loop 25201 1726882687.84027: done getting the remaining hosts for this loop 25201 1726882687.84030: getting the next task for host managed_node2 25201 1726882687.84037: done getting next task for host managed_node2 25201 1726882687.84038: ^ task is: TASK: Delete tap interface {{ interface }} 25201 1726882687.84042: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25201 1726882687.84045: getting variables 25201 1726882687.84046: in VariableManager get_vars() 25201 1726882687.84090: Calling all_inventory to load vars for managed_node2 25201 1726882687.84093: Calling groups_inventory to load vars for managed_node2 25201 1726882687.84096: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882687.84106: Calling all_plugins_play to load vars for managed_node2 25201 1726882687.84109: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882687.84112: Calling groups_plugins_play to load vars for managed_node2 25201 1726882687.84280: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882687.84488: done with get_vars() 25201 1726882687.84497: done getting variables 25201 1726882687.84527: WORKER PROCESS EXITING 25201 1726882687.84560: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 25201 1726882687.84783: variable 'interface' from source: play vars TASK [Delete tap interface veth0] ********************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:65 Friday 20 September 2024 21:38:07 -0400 (0:00:00.042) 0:00:09.023 ****** 25201 1726882687.84924: entering _queue_task() for managed_node2/command 25201 1726882687.85373: worker is 1 (out of 1 available) 25201 1726882687.85385: exiting _queue_task() for managed_node2/command 25201 1726882687.85397: done queuing things up, now waiting for results queue to drain 25201 1726882687.85398: waiting for pending results... 
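The tap-interface branch is skipped the same way: type == 'tap' and state == 'present' and interface not in current_interfaces evaluated to False. As before, only the when expression comes from the log; the command is an assumed example of how a tap device could be created:

  - name: Create tap interface {{ interface }}
    # Assumed command; tap devices are commonly created with "ip tuntap".
    command: ip tuntap add dev {{ interface }} mode tap
    when:
      - type == 'tap'
      - state == 'present'
      - interface not in current_interfaces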
25201 1726882687.86088: running TaskExecutor() for managed_node2/TASK: Delete tap interface veth0 25201 1726882687.86183: in run() - task 0e448fcc-3ce9-313b-197e-000000000160 25201 1726882687.86688: variable 'ansible_search_path' from source: unknown 25201 1726882687.86697: variable 'ansible_search_path' from source: unknown 25201 1726882687.86734: calling self._execute() 25201 1726882687.86816: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882687.86829: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882687.86842: variable 'omit' from source: magic vars 25201 1726882687.87156: variable 'ansible_distribution_major_version' from source: facts 25201 1726882687.87286: Evaluated conditional (ansible_distribution_major_version != '6'): True 25201 1726882687.87665: variable 'type' from source: play vars 25201 1726882687.87678: variable 'state' from source: include params 25201 1726882687.87688: variable 'interface' from source: play vars 25201 1726882687.87697: variable 'current_interfaces' from source: set_fact 25201 1726882687.87727: Evaluated conditional (type == 'tap' and state == 'absent' and interface in current_interfaces): False 25201 1726882687.87736: when evaluation is False, skipping this task 25201 1726882687.87774: _execute() done 25201 1726882687.87781: dumping result to json 25201 1726882687.87789: done dumping result, returning 25201 1726882687.87799: done running TaskExecutor() for managed_node2/TASK: Delete tap interface veth0 [0e448fcc-3ce9-313b-197e-000000000160] 25201 1726882687.87889: sending task result for task 0e448fcc-3ce9-313b-197e-000000000160 skipping: [managed_node2] => { "changed": false, "false_condition": "type == 'tap' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 25201 1726882687.88031: no more pending results, returning what we have 25201 1726882687.88035: results queue empty 25201 1726882687.88036: checking for any_errors_fatal 25201 1726882687.88042: done checking for any_errors_fatal 25201 1726882687.88043: checking for max_fail_percentage 25201 1726882687.88045: done checking for max_fail_percentage 25201 1726882687.88045: checking to see if all hosts have failed and the running result is not ok 25201 1726882687.88046: done checking to see if all hosts have failed 25201 1726882687.88047: getting the remaining hosts for this loop 25201 1726882687.88048: done getting the remaining hosts for this loop 25201 1726882687.88054: getting the next task for host managed_node2 25201 1726882687.88061: done getting next task for host managed_node2 25201 1726882687.88066: ^ task is: TASK: Set up gateway ip on veth peer 25201 1726882687.88069: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25201 1726882687.88072: getting variables 25201 1726882687.88074: in VariableManager get_vars() 25201 1726882687.88111: Calling all_inventory to load vars for managed_node2 25201 1726882687.88114: Calling groups_inventory to load vars for managed_node2 25201 1726882687.88116: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882687.88129: Calling all_plugins_play to load vars for managed_node2 25201 1726882687.88132: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882687.88136: Calling groups_plugins_play to load vars for managed_node2 25201 1726882687.88353: done sending task result for task 0e448fcc-3ce9-313b-197e-000000000160 25201 1726882687.88361: WORKER PROCESS EXITING 25201 1726882687.88376: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882687.88586: done with get_vars() 25201 1726882687.88595: done getting variables 25201 1726882687.88801: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Set up gateway ip on veth peer] ****************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:15 Friday 20 September 2024 21:38:07 -0400 (0:00:00.039) 0:00:09.062 ****** 25201 1726882687.88827: entering _queue_task() for managed_node2/shell 25201 1726882687.88829: Creating lock for shell 25201 1726882687.89325: worker is 1 (out of 1 available) 25201 1726882687.89457: exiting _queue_task() for managed_node2/shell 25201 1726882687.89475: done queuing things up, now waiting for results queue to drain 25201 1726882687.89478: waiting for pending results... 
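The next task, "Set up gateway ip on veth peer" (tests_ipv6.yml:15), runs through the shell action; its _raw_params are recorded verbatim further down in this log (create netns ns1, move peerveth0 into it, add 2001:db8::1/32, bring the device up). A sketch of that task using exactly those commands follows; the surrounding keywords, in particular changed_when, are assumptions rather than the playbook source.

  - name: Set up gateway ip on veth peer
    shell: |
      ip netns add ns1
      ip link set peerveth0 netns ns1
      ip netns exec ns1 ip -6 addr add 2001:db8::1/32 dev peerveth0
      ip netns exec ns1 ip link set peerveth0 up
    # Assumption: the displayed result below shows "changed": false although the
    # module returned "changed": true, consistent with a changed_when: false override.
    changed_when: false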
25201 1726882687.90208: running TaskExecutor() for managed_node2/TASK: Set up gateway ip on veth peer 25201 1726882687.90555: in run() - task 0e448fcc-3ce9-313b-197e-00000000000d 25201 1726882687.90577: variable 'ansible_search_path' from source: unknown 25201 1726882687.90619: calling self._execute() 25201 1726882687.91347: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882687.91357: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882687.91373: variable 'omit' from source: magic vars 25201 1726882687.91728: variable 'ansible_distribution_major_version' from source: facts 25201 1726882687.91746: Evaluated conditional (ansible_distribution_major_version != '6'): True 25201 1726882687.91757: variable 'omit' from source: magic vars 25201 1726882687.91790: variable 'omit' from source: magic vars 25201 1726882687.91923: variable 'interface' from source: play vars 25201 1726882687.92588: variable 'omit' from source: magic vars 25201 1726882687.92632: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25201 1726882687.92675: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25201 1726882687.92699: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25201 1726882687.92720: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25201 1726882687.92737: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25201 1726882687.92773: variable 'inventory_hostname' from source: host vars for 'managed_node2' 25201 1726882687.92781: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882687.92789: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882687.92888: Set connection var ansible_shell_executable to /bin/sh 25201 1726882687.92898: Set connection var ansible_pipelining to False 25201 1726882687.92906: Set connection var ansible_connection to ssh 25201 1726882687.92915: Set connection var ansible_module_compression to ZIP_DEFLATED 25201 1726882687.92921: Set connection var ansible_shell_type to sh 25201 1726882687.92931: Set connection var ansible_timeout to 10 25201 1726882687.92956: variable 'ansible_shell_executable' from source: unknown 25201 1726882687.92965: variable 'ansible_connection' from source: unknown 25201 1726882687.92973: variable 'ansible_module_compression' from source: unknown 25201 1726882687.92979: variable 'ansible_shell_type' from source: unknown 25201 1726882687.92985: variable 'ansible_shell_executable' from source: unknown 25201 1726882687.92991: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882687.92998: variable 'ansible_pipelining' from source: unknown 25201 1726882687.93003: variable 'ansible_timeout' from source: unknown 25201 1726882687.93010: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882687.93142: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 25201 1726882687.93158: variable 'omit' from source: magic vars 25201 
1726882687.93170: starting attempt loop 25201 1726882687.93176: running the handler 25201 1726882687.93187: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 25201 1726882687.93209: _low_level_execute_command(): starting 25201 1726882687.93219: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 25201 1726882687.94981: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882687.94986: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882687.95018: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 25201 1726882687.95022: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration <<< 25201 1726882687.95024: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<< 25201 1726882687.95027: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882687.95193: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882687.95253: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882687.95256: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882687.95380: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882687.97037: stdout chunk (state=3): >>>/root <<< 25201 1726882687.97135: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882687.97223: stderr chunk (state=3): >>><<< 25201 1726882687.97227: stdout chunk (state=3): >>><<< 25201 1726882687.97350: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting 
O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25201 1726882687.97353: _low_level_execute_command(): starting 25201 1726882687.97356: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882687.9725657-25641-27845072963697 `" && echo ansible-tmp-1726882687.9725657-25641-27845072963697="` echo /root/.ansible/tmp/ansible-tmp-1726882687.9725657-25641-27845072963697 `" ) && sleep 0' 25201 1726882687.98301: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25201 1726882687.98850: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882687.98882: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882687.98911: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 25201 1726882687.98915: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<< 25201 1726882687.98917: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882687.99005: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882687.99018: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882687.99137: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882688.01044: stdout chunk (state=3): >>>ansible-tmp-1726882687.9725657-25641-27845072963697=/root/.ansible/tmp/ansible-tmp-1726882687.9725657-25641-27845072963697 <<< 25201 1726882688.01240: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882688.01243: stdout chunk (state=3): >>><<< 25201 1726882688.01245: stderr chunk (state=3): >>><<< 25201 1726882688.01477: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882687.9725657-25641-27845072963697=/root/.ansible/tmp/ansible-tmp-1726882687.9725657-25641-27845072963697 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25201 1726882688.01480: variable 'ansible_module_compression' from source: unknown 25201 1726882688.01483: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-25201fmfeipqk/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 25201 1726882688.01485: variable 'ansible_facts' from source: unknown 25201 1726882688.01493: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882687.9725657-25641-27845072963697/AnsiballZ_command.py 25201 1726882688.01860: Sending initial data 25201 1726882688.01875: Sent initial data (155 bytes) 25201 1726882688.02750: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882688.02753: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882688.02796: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 25201 1726882688.02799: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 25201 1726882688.02801: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882688.02803: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882688.02852: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882688.02858: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882688.02958: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882688.04705: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 25201 1726882688.04800: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 25201 1726882688.04901: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-25201fmfeipqk/tmp7uxyby9t /root/.ansible/tmp/ansible-tmp-1726882687.9725657-25641-27845072963697/AnsiballZ_command.py <<< 25201 1726882688.04996: 
stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 25201 1726882688.06338: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882688.06470: stderr chunk (state=3): >>><<< 25201 1726882688.06473: stdout chunk (state=3): >>><<< 25201 1726882688.06475: done transferring module to remote 25201 1726882688.06477: _low_level_execute_command(): starting 25201 1726882688.06480: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882687.9725657-25641-27845072963697/ /root/.ansible/tmp/ansible-tmp-1726882687.9725657-25641-27845072963697/AnsiballZ_command.py && sleep 0' 25201 1726882688.06905: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882688.06908: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882688.06938: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882688.06944: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882688.06946: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882688.06994: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882688.07009: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882688.07020: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882688.07136: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882688.09061: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882688.09270: stderr chunk (state=3): >>><<< 25201 1726882688.09286: stdout chunk (state=3): >>><<< 25201 1726882688.09373: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK 
debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25201 1726882688.09378: _low_level_execute_command(): starting 25201 1726882688.09380: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882687.9725657-25641-27845072963697/AnsiballZ_command.py && sleep 0' 25201 1726882688.10294: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25201 1726882688.10316: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882688.10334: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882688.10354: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882688.10402: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882688.10424: stderr chunk (state=3): >>>debug2: match not found <<< 25201 1726882688.10439: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882688.10457: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25201 1726882688.10472: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 25201 1726882688.10483: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 25201 1726882688.10494: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882688.10509: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882688.10532: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882688.10545: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882688.10555: stderr chunk (state=3): >>>debug2: match found <<< 25201 1726882688.10570: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882688.10655: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882688.10679: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882688.10697: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882688.10897: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882688.26562: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": "ip netns add ns1\nip link set peerveth0 netns ns1\nip netns exec ns1 ip -6 addr add 2001:db8::1/32 dev peerveth0\nip netns exec ns1 ip link set peerveth0 up\n", "start": "2024-09-20 21:38:08.238403", "end": "2024-09-20 21:38:08.263720", "delta": "0:00:00.025317", "msg": "", "invocation": {"module_args": {"_raw_params": "ip netns add ns1\nip link set peerveth0 netns ns1\nip netns exec ns1 ip -6 addr add 2001:db8::1/32 dev peerveth0\nip netns exec ns1 ip link set peerveth0 up\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 25201 1726882688.27789: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
<<< 25201 1726882688.27841: stderr chunk (state=3): >>><<< 25201 1726882688.27844: stdout chunk (state=3): >>><<< 25201 1726882688.27860: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": "ip netns add ns1\nip link set peerveth0 netns ns1\nip netns exec ns1 ip -6 addr add 2001:db8::1/32 dev peerveth0\nip netns exec ns1 ip link set peerveth0 up\n", "start": "2024-09-20 21:38:08.238403", "end": "2024-09-20 21:38:08.263720", "delta": "0:00:00.025317", "msg": "", "invocation": {"module_args": {"_raw_params": "ip netns add ns1\nip link set peerveth0 netns ns1\nip netns exec ns1 ip -6 addr add 2001:db8::1/32 dev peerveth0\nip netns exec ns1 ip link set peerveth0 up\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
25201 1726882688.27896: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip netns add ns1\nip link set peerveth0 netns ns1\nip netns exec ns1 ip -6 addr add 2001:db8::1/32 dev peerveth0\nip netns exec ns1 ip link set peerveth0 up\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882687.9725657-25641-27845072963697/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 25201 1726882688.27908: _low_level_execute_command(): starting 25201 1726882688.27911: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882687.9725657-25641-27845072963697/ > /dev/null 2>&1 && sleep 0' 25201 1726882688.28366: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882688.28371: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882688.28411: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 25201 1726882688.28414: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882688.28417: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<< 25201 1726882688.28419: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882688.28473: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882688.28477: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882688.28479: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882688.28586: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882688.30426: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882688.30475: stderr chunk (state=3): >>><<< 25201 1726882688.30478: stdout chunk (state=3): >>><<< 25201 1726882688.30491: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25201 1726882688.30497: handler run complete 25201 1726882688.30513: Evaluated conditional (False): False 25201 1726882688.30520: attempt loop complete, returning result 25201 1726882688.30523: _execute() done 25201 1726882688.30525: dumping result to json 25201 1726882688.30530: done dumping result, returning 25201 1726882688.30536: done running TaskExecutor() for managed_node2/TASK: Set up gateway ip on veth peer [0e448fcc-3ce9-313b-197e-00000000000d] 25201 1726882688.30541: sending task result for task 0e448fcc-3ce9-313b-197e-00000000000d 25201 1726882688.30636: done sending task result for task 0e448fcc-3ce9-313b-197e-00000000000d 25201 1726882688.30639: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": "ip netns add ns1\nip link set peerveth0 netns ns1\nip netns exec ns1 ip -6 addr add 2001:db8::1/32 dev peerveth0\nip netns exec ns1 ip link set peerveth0 up\n", "delta": "0:00:00.025317", "end": "2024-09-20 21:38:08.263720", "rc": 0, "start": "2024-09-20 21:38:08.238403" } 25201 1726882688.30738: no more pending results, returning what we have 25201 1726882688.30740: results queue empty 25201 1726882688.30741: checking for any_errors_fatal 25201 1726882688.30746: done checking for any_errors_fatal 25201 1726882688.30746: checking for max_fail_percentage 25201 1726882688.30748: done checking for max_fail_percentage 25201 1726882688.30749: checking to see if all hosts have failed and the running result is not ok 25201 1726882688.30749: done checking to see if all hosts have failed 25201 1726882688.30750: getting the remaining hosts for this loop 25201 1726882688.30751: done getting the remaining hosts for this loop 25201 1726882688.30755: getting the next task for host managed_node2 25201 1726882688.30760: done getting next task for host managed_node2 25201 1726882688.30767: ^ task is: TASK: TEST: I can configure an interface with static ipv6 config 25201 1726882688.30769: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25201 1726882688.30772: getting variables 25201 1726882688.30773: in VariableManager get_vars() 25201 1726882688.30810: Calling all_inventory to load vars for managed_node2 25201 1726882688.30813: Calling groups_inventory to load vars for managed_node2 25201 1726882688.30815: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882688.30824: Calling all_plugins_play to load vars for managed_node2 25201 1726882688.30826: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882688.30828: Calling groups_plugins_play to load vars for managed_node2 25201 1726882688.30940: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882688.31057: done with get_vars() 25201 1726882688.31068: done getting variables 25201 1726882688.31111: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [TEST: I can configure an interface with static ipv6 config] ************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:27 Friday 20 September 2024 21:38:08 -0400 (0:00:00.423) 0:00:09.485 ****** 25201 1726882688.31130: entering _queue_task() for managed_node2/debug 25201 1726882688.31312: worker is 1 (out of 1 available) 25201 1726882688.31325: exiting _queue_task() for managed_node2/debug 25201 1726882688.31335: done queuing things up, now waiting for results queue to drain 25201 1726882688.31337: waiting for pending results... 25201 1726882688.31488: running TaskExecutor() for managed_node2/TASK: TEST: I can configure an interface with static ipv6 config 25201 1726882688.31546: in run() - task 0e448fcc-3ce9-313b-197e-00000000000f 25201 1726882688.31558: variable 'ansible_search_path' from source: unknown 25201 1726882688.31589: calling self._execute() 25201 1726882688.31658: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882688.31661: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882688.31675: variable 'omit' from source: magic vars 25201 1726882688.31984: variable 'ansible_distribution_major_version' from source: facts 25201 1726882688.31995: Evaluated conditional (ansible_distribution_major_version != '6'): True 25201 1726882688.31998: variable 'omit' from source: magic vars 25201 1726882688.32014: variable 'omit' from source: magic vars 25201 1726882688.32042: variable 'omit' from source: magic vars 25201 1726882688.32075: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25201 1726882688.32100: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25201 1726882688.32117: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25201 1726882688.32129: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25201 1726882688.32141: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25201 1726882688.32170: variable 'inventory_hostname' from source: host vars for 'managed_node2' 25201 1726882688.32173: variable 'ansible_host' from 
source: host vars for 'managed_node2' 25201 1726882688.32175: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882688.32240: Set connection var ansible_shell_executable to /bin/sh 25201 1726882688.32244: Set connection var ansible_pipelining to False 25201 1726882688.32251: Set connection var ansible_connection to ssh 25201 1726882688.32255: Set connection var ansible_module_compression to ZIP_DEFLATED 25201 1726882688.32261: Set connection var ansible_shell_type to sh 25201 1726882688.32272: Set connection var ansible_timeout to 10 25201 1726882688.32288: variable 'ansible_shell_executable' from source: unknown 25201 1726882688.32291: variable 'ansible_connection' from source: unknown 25201 1726882688.32293: variable 'ansible_module_compression' from source: unknown 25201 1726882688.32296: variable 'ansible_shell_type' from source: unknown 25201 1726882688.32298: variable 'ansible_shell_executable' from source: unknown 25201 1726882688.32300: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882688.32303: variable 'ansible_pipelining' from source: unknown 25201 1726882688.32305: variable 'ansible_timeout' from source: unknown 25201 1726882688.32310: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882688.32414: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 25201 1726882688.32423: variable 'omit' from source: magic vars 25201 1726882688.32426: starting attempt loop 25201 1726882688.32428: running the handler 25201 1726882688.32470: handler run complete 25201 1726882688.32482: attempt loop complete, returning result 25201 1726882688.32485: _execute() done 25201 1726882688.32487: dumping result to json 25201 1726882688.32490: done dumping result, returning 25201 1726882688.32496: done running TaskExecutor() for managed_node2/TASK: TEST: I can configure an interface with static ipv6 config [0e448fcc-3ce9-313b-197e-00000000000f] 25201 1726882688.32501: sending task result for task 0e448fcc-3ce9-313b-197e-00000000000f 25201 1726882688.32587: done sending task result for task 0e448fcc-3ce9-313b-197e-00000000000f 25201 1726882688.32590: WORKER PROCESS EXITING ok: [managed_node2] => {} MSG: ################################################## 25201 1726882688.32632: no more pending results, returning what we have 25201 1726882688.32635: results queue empty 25201 1726882688.32636: checking for any_errors_fatal 25201 1726882688.32640: done checking for any_errors_fatal 25201 1726882688.32641: checking for max_fail_percentage 25201 1726882688.32643: done checking for max_fail_percentage 25201 1726882688.32643: checking to see if all hosts have failed and the running result is not ok 25201 1726882688.32644: done checking to see if all hosts have failed 25201 1726882688.32645: getting the remaining hosts for this loop 25201 1726882688.32646: done getting the remaining hosts for this loop 25201 1726882688.32649: getting the next task for host managed_node2 25201 1726882688.32654: done getting next task for host managed_node2 25201 1726882688.32658: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 25201 1726882688.32661: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, 
fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25201 1726882688.32684: getting variables 25201 1726882688.32685: in VariableManager get_vars() 25201 1726882688.32714: Calling all_inventory to load vars for managed_node2 25201 1726882688.32716: Calling groups_inventory to load vars for managed_node2 25201 1726882688.32718: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882688.32723: Calling all_plugins_play to load vars for managed_node2 25201 1726882688.32725: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882688.32726: Calling groups_plugins_play to load vars for managed_node2 25201 1726882688.32860: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882688.32981: done with get_vars() 25201 1726882688.32988: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 21:38:08 -0400 (0:00:00.019) 0:00:09.505 ****** 25201 1726882688.33049: entering _queue_task() for managed_node2/include_tasks 25201 1726882688.33209: worker is 1 (out of 1 available) 25201 1726882688.33222: exiting _queue_task() for managed_node2/include_tasks 25201 1726882688.33234: done queuing things up, now waiting for results queue to drain 25201 1726882688.33235: waiting for pending results... 
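The include queued above (task path roles/network/tasks/main.yml:4) resolves to the role's set_facts.yml, as the "processing included file" lines below show. In the role source it is presumably no more than:

- name: Ensure ansible_facts used by role
  include_tasks: set_facts.yml

Everything from here through the service_facts call is the content of that included file being expanded for managed_node2.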
25201 1726882688.33385: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 25201 1726882688.33462: in run() - task 0e448fcc-3ce9-313b-197e-000000000017 25201 1726882688.33477: variable 'ansible_search_path' from source: unknown 25201 1726882688.33481: variable 'ansible_search_path' from source: unknown 25201 1726882688.33506: calling self._execute() 25201 1726882688.33559: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882688.33564: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882688.33576: variable 'omit' from source: magic vars 25201 1726882688.33812: variable 'ansible_distribution_major_version' from source: facts 25201 1726882688.33822: Evaluated conditional (ansible_distribution_major_version != '6'): True 25201 1726882688.33827: _execute() done 25201 1726882688.33830: dumping result to json 25201 1726882688.33833: done dumping result, returning 25201 1726882688.33839: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0e448fcc-3ce9-313b-197e-000000000017] 25201 1726882688.33844: sending task result for task 0e448fcc-3ce9-313b-197e-000000000017 25201 1726882688.33930: done sending task result for task 0e448fcc-3ce9-313b-197e-000000000017 25201 1726882688.33933: WORKER PROCESS EXITING 25201 1726882688.33969: no more pending results, returning what we have 25201 1726882688.33973: in VariableManager get_vars() 25201 1726882688.34014: Calling all_inventory to load vars for managed_node2 25201 1726882688.34016: Calling groups_inventory to load vars for managed_node2 25201 1726882688.34018: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882688.34024: Calling all_plugins_play to load vars for managed_node2 25201 1726882688.34025: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882688.34027: Calling groups_plugins_play to load vars for managed_node2 25201 1726882688.34133: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882688.34248: done with get_vars() 25201 1726882688.34254: variable 'ansible_search_path' from source: unknown 25201 1726882688.34254: variable 'ansible_search_path' from source: unknown 25201 1726882688.34281: we have included files to process 25201 1726882688.34282: generating all_blocks data 25201 1726882688.34283: done generating all_blocks data 25201 1726882688.34286: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 25201 1726882688.34287: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 25201 1726882688.34288: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 25201 1726882688.34755: done processing included file 25201 1726882688.34757: iterating over new_blocks loaded from include file 25201 1726882688.34757: in VariableManager get_vars() 25201 1726882688.34774: done with get_vars() 25201 1726882688.34776: filtering new block on tags 25201 1726882688.34787: done filtering new block on tags 25201 1726882688.34788: in VariableManager get_vars() 25201 1726882688.34801: done with get_vars() 25201 1726882688.34802: filtering new block on tags 25201 1726882688.34813: done filtering new block on tags 25201 1726882688.34815: in 
VariableManager get_vars() 25201 1726882688.34827: done with get_vars() 25201 1726882688.34828: filtering new block on tags 25201 1726882688.34838: done filtering new block on tags 25201 1726882688.34839: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node2 25201 1726882688.34842: extending task lists for all hosts with included blocks 25201 1726882688.35315: done extending task lists 25201 1726882688.35316: done processing included files 25201 1726882688.35316: results queue empty 25201 1726882688.35317: checking for any_errors_fatal 25201 1726882688.35319: done checking for any_errors_fatal 25201 1726882688.35319: checking for max_fail_percentage 25201 1726882688.35320: done checking for max_fail_percentage 25201 1726882688.35320: checking to see if all hosts have failed and the running result is not ok 25201 1726882688.35321: done checking to see if all hosts have failed 25201 1726882688.35321: getting the remaining hosts for this loop 25201 1726882688.35322: done getting the remaining hosts for this loop 25201 1726882688.35323: getting the next task for host managed_node2 25201 1726882688.35326: done getting next task for host managed_node2 25201 1726882688.35328: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 25201 1726882688.35329: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25201 1726882688.35335: getting variables 25201 1726882688.35336: in VariableManager get_vars() 25201 1726882688.35346: Calling all_inventory to load vars for managed_node2 25201 1726882688.35347: Calling groups_inventory to load vars for managed_node2 25201 1726882688.35348: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882688.35351: Calling all_plugins_play to load vars for managed_node2 25201 1726882688.35352: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882688.35354: Calling groups_plugins_play to load vars for managed_node2 25201 1726882688.35449: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882688.35561: done with get_vars() 25201 1726882688.35569: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 21:38:08 -0400 (0:00:00.025) 0:00:09.530 ****** 25201 1726882688.35613: entering _queue_task() for managed_node2/setup 25201 1726882688.35778: worker is 1 (out of 1 available) 25201 1726882688.35791: exiting _queue_task() for managed_node2/setup 25201 1726882688.35802: done queuing things up, now waiting for results queue to drain 25201 1726882688.35804: waiting for pending results... 25201 1726882688.35945: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 25201 1726882688.36030: in run() - task 0e448fcc-3ce9-313b-197e-0000000001fc 25201 1726882688.36040: variable 'ansible_search_path' from source: unknown 25201 1726882688.36043: variable 'ansible_search_path' from source: unknown 25201 1726882688.36074: calling self._execute() 25201 1726882688.36124: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882688.36128: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882688.36136: variable 'omit' from source: magic vars 25201 1726882688.36370: variable 'ansible_distribution_major_version' from source: facts 25201 1726882688.36380: Evaluated conditional (ansible_distribution_major_version != '6'): True 25201 1726882688.36518: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 25201 1726882688.37994: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 25201 1726882688.38045: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 25201 1726882688.38074: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 25201 1726882688.38100: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 25201 1726882688.38119: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 25201 1726882688.38179: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25201 1726882688.38200: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 25201 1726882688.38218: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25201 1726882688.38249: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25201 1726882688.38259: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25201 1726882688.38297: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25201 1726882688.38315: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25201 1726882688.38332: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25201 1726882688.38361: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25201 1726882688.38374: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25201 1726882688.38478: variable '__network_required_facts' from source: role '' defaults 25201 1726882688.38485: variable 'ansible_facts' from source: unknown 25201 1726882688.38539: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 25201 1726882688.38543: when evaluation is False, skipping this task 25201 1726882688.38545: _execute() done 25201 1726882688.38548: dumping result to json 25201 1726882688.38552: done dumping result, returning 25201 1726882688.38554: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0e448fcc-3ce9-313b-197e-0000000001fc] 25201 1726882688.38560: sending task result for task 0e448fcc-3ce9-313b-197e-0000000001fc 25201 1726882688.38637: done sending task result for task 0e448fcc-3ce9-313b-197e-0000000001fc 25201 1726882688.38640: WORKER PROCESS EXITING skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 25201 1726882688.38684: no more pending results, returning what we have 25201 1726882688.38687: results queue empty 25201 1726882688.38688: checking for any_errors_fatal 25201 1726882688.38689: done checking for any_errors_fatal 25201 1726882688.38690: checking for max_fail_percentage 25201 1726882688.38691: done checking for max_fail_percentage 25201 1726882688.38692: checking to see if all hosts have failed and the running result is not ok 25201 1726882688.38693: done checking to see if all hosts have failed 25201 1726882688.38694: getting the remaining hosts for this loop 25201 1726882688.38695: done getting the remaining hosts for 
this loop 25201 1726882688.38698: getting the next task for host managed_node2 25201 1726882688.38707: done getting next task for host managed_node2 25201 1726882688.38711: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 25201 1726882688.38714: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25201 1726882688.38726: getting variables 25201 1726882688.38728: in VariableManager get_vars() 25201 1726882688.38758: Calling all_inventory to load vars for managed_node2 25201 1726882688.38761: Calling groups_inventory to load vars for managed_node2 25201 1726882688.38763: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882688.38772: Calling all_plugins_play to load vars for managed_node2 25201 1726882688.38775: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882688.38778: Calling groups_plugins_play to load vars for managed_node2 25201 1726882688.38881: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882688.39001: done with get_vars() 25201 1726882688.39007: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 21:38:08 -0400 (0:00:00.034) 0:00:09.565 ****** 25201 1726882688.39072: entering _queue_task() for managed_node2/stat 25201 1726882688.39229: worker is 1 (out of 1 available) 25201 1726882688.39241: exiting _queue_task() for managed_node2/stat 25201 1726882688.39252: done queuing things up, now waiting for results queue to drain 25201 1726882688.39254: waiting for pending results... 
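The skip recorded above follows the usual fact-gating pattern: the role only runs setup when something listed in __network_required_facts is missing from ansible_facts, and here the difference was empty. The when expression is copied from the evaluated conditional in the log; the setup arguments are an assumption, since the log only shows that a setup task was queued and then skipped.

- name: Ensure ansible_facts used by role are present
  setup:
    gather_subset: min   # assumed subset; not visible in the log
  when: __network_required_facts | difference(ansible_facts.keys() | list) | length > 0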
25201 1726882688.39404: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 25201 1726882688.39493: in run() - task 0e448fcc-3ce9-313b-197e-0000000001fe 25201 1726882688.39502: variable 'ansible_search_path' from source: unknown 25201 1726882688.39506: variable 'ansible_search_path' from source: unknown 25201 1726882688.39530: calling self._execute() 25201 1726882688.39591: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882688.39595: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882688.39603: variable 'omit' from source: magic vars 25201 1726882688.39841: variable 'ansible_distribution_major_version' from source: facts 25201 1726882688.39851: Evaluated conditional (ansible_distribution_major_version != '6'): True 25201 1726882688.39960: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 25201 1726882688.40205: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 25201 1726882688.40240: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 25201 1726882688.40268: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 25201 1726882688.40293: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 25201 1726882688.40355: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 25201 1726882688.40375: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 25201 1726882688.40393: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 25201 1726882688.40414: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 25201 1726882688.40471: variable '__network_is_ostree' from source: set_fact 25201 1726882688.40477: Evaluated conditional (not __network_is_ostree is defined): False 25201 1726882688.40480: when evaluation is False, skipping this task 25201 1726882688.40482: _execute() done 25201 1726882688.40485: dumping result to json 25201 1726882688.40489: done dumping result, returning 25201 1726882688.40495: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [0e448fcc-3ce9-313b-197e-0000000001fe] 25201 1726882688.40500: sending task result for task 0e448fcc-3ce9-313b-197e-0000000001fe 25201 1726882688.40579: done sending task result for task 0e448fcc-3ce9-313b-197e-0000000001fe 25201 1726882688.40582: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 25201 1726882688.40629: no more pending results, returning what we have 25201 1726882688.40632: results queue empty 25201 1726882688.40633: checking for any_errors_fatal 25201 1726882688.40637: done checking for any_errors_fatal 25201 1726882688.40638: checking for 
max_fail_percentage 25201 1726882688.40639: done checking for max_fail_percentage 25201 1726882688.40640: checking to see if all hosts have failed and the running result is not ok 25201 1726882688.40641: done checking to see if all hosts have failed 25201 1726882688.40641: getting the remaining hosts for this loop 25201 1726882688.40643: done getting the remaining hosts for this loop 25201 1726882688.40646: getting the next task for host managed_node2 25201 1726882688.40651: done getting next task for host managed_node2 25201 1726882688.40654: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 25201 1726882688.40657: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25201 1726882688.40673: getting variables 25201 1726882688.40674: in VariableManager get_vars() 25201 1726882688.40706: Calling all_inventory to load vars for managed_node2 25201 1726882688.40708: Calling groups_inventory to load vars for managed_node2 25201 1726882688.40709: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882688.40715: Calling all_plugins_play to load vars for managed_node2 25201 1726882688.40717: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882688.40718: Calling groups_plugins_play to load vars for managed_node2 25201 1726882688.40845: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882688.40966: done with get_vars() 25201 1726882688.40972: done getting variables 25201 1726882688.41007: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 21:38:08 -0400 (0:00:00.019) 0:00:09.584 ****** 25201 1726882688.41031: entering _queue_task() for managed_node2/set_fact 25201 1726882688.41192: worker is 1 (out of 1 available) 25201 1726882688.41203: exiting _queue_task() for managed_node2/set_fact 25201 1726882688.41213: done queuing things up, now waiting for results queue to drain 25201 1726882688.41215: waiting for pending results... 
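The two ostree tasks at set_facts.yml:12 and set_facts.yml:17 are both gated on not __network_is_ostree is defined; that fact is already set for managed_node2, so the stat task was skipped above and the set_fact task is skipped just below. A sketch of the pair, with only the task names, actions, and condition taken from the log (the stat path and register name are assumptions):

- name: Check if system is ostree
  stat:
    path: /run/ostree-booted      # assumed marker file; not shown in the log
  register: __ostree_booted_stat  # hypothetical register name
  when: not __network_is_ostree is defined

- name: Set flag to indicate system is ostree
  set_fact:
    __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"
  when: not __network_is_ostree is defined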
25201 1726882688.41353: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 25201 1726882688.41445: in run() - task 0e448fcc-3ce9-313b-197e-0000000001ff 25201 1726882688.41474: variable 'ansible_search_path' from source: unknown 25201 1726882688.41481: variable 'ansible_search_path' from source: unknown 25201 1726882688.41514: calling self._execute() 25201 1726882688.41586: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882688.41598: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882688.41610: variable 'omit' from source: magic vars 25201 1726882688.41940: variable 'ansible_distribution_major_version' from source: facts 25201 1726882688.41960: Evaluated conditional (ansible_distribution_major_version != '6'): True 25201 1726882688.42123: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 25201 1726882688.42373: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 25201 1726882688.42419: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 25201 1726882688.42461: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 25201 1726882688.42499: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 25201 1726882688.42585: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 25201 1726882688.42615: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 25201 1726882688.42649: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 25201 1726882688.42683: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 25201 1726882688.42772: variable '__network_is_ostree' from source: set_fact 25201 1726882688.42784: Evaluated conditional (not __network_is_ostree is defined): False 25201 1726882688.42792: when evaluation is False, skipping this task 25201 1726882688.42799: _execute() done 25201 1726882688.42806: dumping result to json 25201 1726882688.42812: done dumping result, returning 25201 1726882688.42826: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0e448fcc-3ce9-313b-197e-0000000001ff] 25201 1726882688.42829: sending task result for task 0e448fcc-3ce9-313b-197e-0000000001ff 25201 1726882688.42927: done sending task result for task 0e448fcc-3ce9-313b-197e-0000000001ff 25201 1726882688.42931: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 25201 1726882688.42991: no more pending results, returning what we have 25201 1726882688.42993: results queue empty 25201 1726882688.42994: checking for any_errors_fatal 25201 1726882688.42999: done checking for any_errors_fatal 25201 
1726882688.43000: checking for max_fail_percentage 25201 1726882688.43001: done checking for max_fail_percentage 25201 1726882688.43002: checking to see if all hosts have failed and the running result is not ok 25201 1726882688.43003: done checking to see if all hosts have failed 25201 1726882688.43004: getting the remaining hosts for this loop 25201 1726882688.43005: done getting the remaining hosts for this loop 25201 1726882688.43008: getting the next task for host managed_node2 25201 1726882688.43015: done getting next task for host managed_node2 25201 1726882688.43018: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 25201 1726882688.43022: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25201 1726882688.43030: getting variables 25201 1726882688.43031: in VariableManager get_vars() 25201 1726882688.43054: Calling all_inventory to load vars for managed_node2 25201 1726882688.43055: Calling groups_inventory to load vars for managed_node2 25201 1726882688.43057: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882688.43067: Calling all_plugins_play to load vars for managed_node2 25201 1726882688.43071: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882688.43073: Calling groups_plugins_play to load vars for managed_node2 25201 1726882688.43181: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882688.43303: done with get_vars() 25201 1726882688.43310: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 21:38:08 -0400 (0:00:00.023) 0:00:09.608 ****** 25201 1726882688.43370: entering _queue_task() for managed_node2/service_facts 25201 1726882688.43372: Creating lock for service_facts 25201 1726882688.43527: worker is 1 (out of 1 available) 25201 1726882688.43539: exiting _queue_task() for managed_node2/service_facts 25201 1726882688.43551: done queuing things up, now waiting for results queue to drain 25201 1726882688.43552: waiting for pending results... 
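service_facts takes no options, so the task queued above presumably amounts to the two lines below; the module populates ansible_facts.services on the managed host, which, given the task name, the role evidently uses to see which network-related services exist and are running.

- name: Check which services are running
  service_facts:

The rest of this block is the mechanics of running that module: a temporary directory is created on the remote host, the AnsiballZ_service_facts.py payload is copied over sftp, made executable, and run with the remote /usr/bin/python3.9.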
25201 1726882688.43713: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running 25201 1726882688.43805: in run() - task 0e448fcc-3ce9-313b-197e-000000000201 25201 1726882688.43815: variable 'ansible_search_path' from source: unknown 25201 1726882688.43818: variable 'ansible_search_path' from source: unknown 25201 1726882688.43845: calling self._execute() 25201 1726882688.43904: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882688.43907: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882688.43916: variable 'omit' from source: magic vars 25201 1726882688.44158: variable 'ansible_distribution_major_version' from source: facts 25201 1726882688.44172: Evaluated conditional (ansible_distribution_major_version != '6'): True 25201 1726882688.44177: variable 'omit' from source: magic vars 25201 1726882688.44223: variable 'omit' from source: magic vars 25201 1726882688.44244: variable 'omit' from source: magic vars 25201 1726882688.44280: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25201 1726882688.44307: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25201 1726882688.44323: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25201 1726882688.44336: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25201 1726882688.44345: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25201 1726882688.44373: variable 'inventory_hostname' from source: host vars for 'managed_node2' 25201 1726882688.44376: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882688.44379: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882688.44446: Set connection var ansible_shell_executable to /bin/sh 25201 1726882688.44449: Set connection var ansible_pipelining to False 25201 1726882688.44455: Set connection var ansible_connection to ssh 25201 1726882688.44460: Set connection var ansible_module_compression to ZIP_DEFLATED 25201 1726882688.44462: Set connection var ansible_shell_type to sh 25201 1726882688.44472: Set connection var ansible_timeout to 10 25201 1726882688.44489: variable 'ansible_shell_executable' from source: unknown 25201 1726882688.44492: variable 'ansible_connection' from source: unknown 25201 1726882688.44495: variable 'ansible_module_compression' from source: unknown 25201 1726882688.44497: variable 'ansible_shell_type' from source: unknown 25201 1726882688.44499: variable 'ansible_shell_executable' from source: unknown 25201 1726882688.44501: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882688.44505: variable 'ansible_pipelining' from source: unknown 25201 1726882688.44507: variable 'ansible_timeout' from source: unknown 25201 1726882688.44509: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882688.44644: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 25201 1726882688.44651: variable 'omit' from source: magic vars 25201 
1726882688.44657: starting attempt loop 25201 1726882688.44659: running the handler 25201 1726882688.44675: _low_level_execute_command(): starting 25201 1726882688.44682: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 25201 1726882688.45710: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882688.45714: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882688.45744: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 25201 1726882688.45748: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882688.45751: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882688.45823: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882688.45826: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882688.45832: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882688.45940: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882688.47584: stdout chunk (state=3): >>>/root <<< 25201 1726882688.47752: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882688.47755: stdout chunk (state=3): >>><<< 25201 1726882688.47756: stderr chunk (state=3): >>><<< 25201 1726882688.47860: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25201 1726882688.47869: _low_level_execute_command(): starting 25201 1726882688.47873: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo 
/root/.ansible/tmp/ansible-tmp-1726882688.477799-25668-114222070603173 `" && echo ansible-tmp-1726882688.477799-25668-114222070603173="` echo /root/.ansible/tmp/ansible-tmp-1726882688.477799-25668-114222070603173 `" ) && sleep 0' 25201 1726882688.48560: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25201 1726882688.48581: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882688.48597: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882688.48616: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882688.48670: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882688.48692: stderr chunk (state=3): >>>debug2: match not found <<< 25201 1726882688.48707: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882688.48727: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25201 1726882688.48740: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 25201 1726882688.48753: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 25201 1726882688.48776: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882688.48792: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882688.48816: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882688.48840: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882688.48853: stderr chunk (state=3): >>>debug2: match found <<< 25201 1726882688.48876: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882688.48959: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882688.48991: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882688.49007: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882688.49138: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882688.50999: stdout chunk (state=3): >>>ansible-tmp-1726882688.477799-25668-114222070603173=/root/.ansible/tmp/ansible-tmp-1726882688.477799-25668-114222070603173 <<< 25201 1726882688.51186: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882688.51189: stdout chunk (state=3): >>><<< 25201 1726882688.51192: stderr chunk (state=3): >>><<< 25201 1726882688.51270: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882688.477799-25668-114222070603173=/root/.ansible/tmp/ansible-tmp-1726882688.477799-25668-114222070603173 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25201 1726882688.51274: variable 'ansible_module_compression' from source: unknown 25201 1726882688.51370: ANSIBALLZ: Using lock for service_facts 25201 1726882688.51373: ANSIBALLZ: Acquiring lock 25201 1726882688.51376: ANSIBALLZ: Lock acquired: 140300036694400 25201 1726882688.51378: ANSIBALLZ: Creating module 25201 1726882688.67192: ANSIBALLZ: Writing module into payload 25201 1726882688.67322: ANSIBALLZ: Writing module 25201 1726882688.67353: ANSIBALLZ: Renaming module 25201 1726882688.67370: ANSIBALLZ: Done creating module 25201 1726882688.67396: variable 'ansible_facts' from source: unknown 25201 1726882688.67474: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882688.477799-25668-114222070603173/AnsiballZ_service_facts.py 25201 1726882688.67651: Sending initial data 25201 1726882688.67654: Sent initial data (161 bytes) 25201 1726882688.68705: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25201 1726882688.68719: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882688.68735: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882688.68753: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882688.68803: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882688.68821: stderr chunk (state=3): >>>debug2: match not found <<< 25201 1726882688.68834: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882688.68851: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25201 1726882688.68862: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 25201 1726882688.68877: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 25201 1726882688.68888: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882688.68900: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882688.68916: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882688.68931: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882688.68941: stderr chunk (state=3): >>>debug2: match found <<< 25201 1726882688.68953: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882688.69038: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882688.69060: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882688.69083: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 
1726882688.69220: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882688.71061: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 25201 1726882688.71156: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 25201 1726882688.71259: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-25201fmfeipqk/tmp_cze8z_d /root/.ansible/tmp/ansible-tmp-1726882688.477799-25668-114222070603173/AnsiballZ_service_facts.py <<< 25201 1726882688.71350: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 25201 1726882688.72891: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882688.73081: stderr chunk (state=3): >>><<< 25201 1726882688.73085: stdout chunk (state=3): >>><<< 25201 1726882688.73087: done transferring module to remote 25201 1726882688.73089: _low_level_execute_command(): starting 25201 1726882688.73091: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882688.477799-25668-114222070603173/ /root/.ansible/tmp/ansible-tmp-1726882688.477799-25668-114222070603173/AnsiballZ_service_facts.py && sleep 0' 25201 1726882688.73693: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25201 1726882688.73705: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882688.73718: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882688.73741: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882688.73786: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882688.73798: stderr chunk (state=3): >>>debug2: match not found <<< 25201 1726882688.73810: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882688.73826: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25201 1726882688.73842: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 25201 1726882688.73852: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 25201 1726882688.73867: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882688.73882: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882688.73896: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882688.73906: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882688.73916: stderr chunk (state=3): >>>debug2: match found <<< 25201 1726882688.73928: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882688.74012: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882688.74032: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882688.74046: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882688.74188: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882688.75979: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882688.76022: stderr chunk (state=3): >>><<< 25201 1726882688.76025: stdout chunk (state=3): >>><<< 25201 1726882688.76070: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25201 1726882688.76073: _low_level_execute_command(): starting 25201 1726882688.76076: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882688.477799-25668-114222070603173/AnsiballZ_service_facts.py && sleep 0' 25201 1726882688.76685: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25201 1726882688.76699: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882688.76714: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882688.76731: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882688.76776: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882688.76789: stderr chunk (state=3): >>>debug2: match not found <<< 25201 1726882688.76805: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882688.76824: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25201 1726882688.76836: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 25201 1726882688.76847: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 25201 1726882688.76860: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882688.76879: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882688.76896: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882688.76909: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882688.76921: stderr chunk (state=3): >>>debug2: match found <<< 25201 1726882688.76936: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882688.77012: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882688.77034: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882688.77051: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882688.77197: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882690.10606: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "s<<< 25201 1726882690.10621: stdout chunk (state=3): >>>tate": "stopped", 
"status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rhsmcertd.service": {"name": "rhsmcertd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": 
"systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"n<<< 25201 1726882690.10647: stdout chunk (state=3): >>>ame": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, 
"dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhcd.service": {"name": "rhcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm-facts.service": {"name": "rhsm-facts.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm.service": {"name": "rhsm.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "sys<<< 25201 1726882690.10668: stdout chunk (state=3): >>>temd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", 
"state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 25201 1726882690.11958: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
<<< 25201 1726882690.11961: stdout chunk (state=3): >>><<< 25201 1726882690.11971: stderr chunk (state=3): >>><<< 25201 1726882690.11994: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": 
"initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rhsmcertd.service": {"name": "rhsmcertd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": 
{"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": 
"systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": 
"firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhcd.service": {"name": "rhcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm-facts.service": {"name": "rhsm-facts.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm.service": {"name": "rhsm.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": 
"sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
25201 1726882690.12585: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882688.477799-25668-114222070603173/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 25201 1726882690.12594: _low_level_execute_command(): starting 25201 1726882690.12600: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882688.477799-25668-114222070603173/ > /dev/null 2>&1 && sleep 0' 25201 1726882690.13350: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25201 1726882690.13368: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882690.13390: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882690.13410: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882690.13454: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882690.13473: stderr chunk (state=3): >>>debug2: match not found <<< 25201 1726882690.13493: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882690.13514: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25201 1726882690.13527: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 25201 1726882690.13538: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 25201 1726882690.13551: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882690.13569: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882690.13586: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882690.13601: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882690.13618: stderr chunk (state=3): >>>debug2: match found <<< 25201 1726882690.13633: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882690.13717: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882690.13736: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882690.13750: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882690.13886: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882690.15679: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882690.15739: stderr chunk (state=3): >>><<< 25201 1726882690.15743: stdout chunk (state=3): >>><<< 25201 1726882690.15980: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25201 1726882690.15983: handler run complete 25201 1726882690.15985: variable 'ansible_facts' from source: unknown 25201 1726882690.16592: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882690.17476: variable 'ansible_facts' from source: unknown 25201 1726882690.17787: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882690.18134: attempt loop complete, returning result 25201 1726882690.18281: _execute() done 25201 1726882690.18288: dumping result to json 25201 1726882690.18348: done dumping result, returning 25201 1726882690.18386: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running [0e448fcc-3ce9-313b-197e-000000000201] 25201 1726882690.18487: sending task result for task 0e448fcc-3ce9-313b-197e-000000000201 ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 25201 1726882690.19906: no more pending results, returning what we have 25201 1726882690.19909: results queue empty 25201 1726882690.19911: checking for any_errors_fatal 25201 1726882690.19914: done checking for any_errors_fatal 25201 1726882690.19914: checking for max_fail_percentage 25201 1726882690.19916: done checking for max_fail_percentage 25201 1726882690.19917: checking to see if all hosts have failed and the running result is not ok 25201 1726882690.19918: done checking to see if all hosts have failed 25201 1726882690.19918: getting the remaining hosts for this loop 25201 1726882690.19920: done getting the remaining hosts for this loop 25201 1726882690.19923: getting the next task for host managed_node2 25201 1726882690.19930: done getting next task for host managed_node2 25201 1726882690.19933: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 25201 1726882690.19937: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25201 1726882690.19946: getting variables 25201 1726882690.19947: in VariableManager get_vars() 25201 1726882690.19984: Calling all_inventory to load vars for managed_node2 25201 1726882690.19987: Calling groups_inventory to load vars for managed_node2 25201 1726882690.19990: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882690.19999: Calling all_plugins_play to load vars for managed_node2 25201 1726882690.20002: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882690.20005: Calling groups_plugins_play to load vars for managed_node2 25201 1726882690.20362: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882690.21155: done sending task result for task 0e448fcc-3ce9-313b-197e-000000000201 25201 1726882690.21158: WORKER PROCESS EXITING 25201 1726882690.21231: done with get_vars() 25201 1726882690.21243: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 21:38:10 -0400 (0:00:01.780) 0:00:11.389 ****** 25201 1726882690.21448: entering _queue_task() for managed_node2/package_facts 25201 1726882690.21456: Creating lock for package_facts 25201 1726882690.21750: worker is 1 (out of 1 available) 25201 1726882690.21762: exiting _queue_task() for managed_node2/package_facts 25201 1726882690.21777: done queuing things up, now waiting for results queue to drain 25201 1726882690.21779: waiting for pending results... 
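The "censored" result a few lines up is the expected effect of no_log: true on the service-check task: the gathered facts are still set on the host (the later "variable 'ansible_facts' from source: unknown" lookups consume them), but the task's own return payload is hidden from playbook output. The package_facts task being queued here follows the same pattern and fills ansible_facts.packages, a dict keyed by package name whose values are lists of {name, version, release, epoch, arch, source} entries. A small illustrative sketch of that usage, written as a standalone playbook excerpt rather than the role's task file:

- name: Check which packages are installed    # mirrors the task name in this log
  ansible.builtin.package_facts:
    manager: auto
  no_log: true                                # keeps the large result out of normal output

- name: Show the installed openssh version
  ansible.builtin.debug:
    msg: "openssh {{ ansible_facts.packages['openssh'][0].version }}"   # 'openssh' appears in the facts dumped below (8.7p1)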
25201 1726882690.22159: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 25201 1726882690.22304: in run() - task 0e448fcc-3ce9-313b-197e-000000000202 25201 1726882690.22322: variable 'ansible_search_path' from source: unknown 25201 1726882690.22330: variable 'ansible_search_path' from source: unknown 25201 1726882690.22373: calling self._execute() 25201 1726882690.22454: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882690.22472: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882690.22489: variable 'omit' from source: magic vars 25201 1726882690.22869: variable 'ansible_distribution_major_version' from source: facts 25201 1726882690.22880: Evaluated conditional (ansible_distribution_major_version != '6'): True 25201 1726882690.22886: variable 'omit' from source: magic vars 25201 1726882690.22948: variable 'omit' from source: magic vars 25201 1726882690.22981: variable 'omit' from source: magic vars 25201 1726882690.23017: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25201 1726882690.23043: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25201 1726882690.23058: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25201 1726882690.23074: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25201 1726882690.23084: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25201 1726882690.23115: variable 'inventory_hostname' from source: host vars for 'managed_node2' 25201 1726882690.23118: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882690.23126: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882690.23195: Set connection var ansible_shell_executable to /bin/sh 25201 1726882690.23199: Set connection var ansible_pipelining to False 25201 1726882690.23204: Set connection var ansible_connection to ssh 25201 1726882690.23209: Set connection var ansible_module_compression to ZIP_DEFLATED 25201 1726882690.23211: Set connection var ansible_shell_type to sh 25201 1726882690.23220: Set connection var ansible_timeout to 10 25201 1726882690.23239: variable 'ansible_shell_executable' from source: unknown 25201 1726882690.23242: variable 'ansible_connection' from source: unknown 25201 1726882690.23245: variable 'ansible_module_compression' from source: unknown 25201 1726882690.23247: variable 'ansible_shell_type' from source: unknown 25201 1726882690.23250: variable 'ansible_shell_executable' from source: unknown 25201 1726882690.23252: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882690.23254: variable 'ansible_pipelining' from source: unknown 25201 1726882690.23257: variable 'ansible_timeout' from source: unknown 25201 1726882690.23261: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882690.23400: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 25201 1726882690.23408: variable 'omit' from source: magic vars 25201 
1726882690.23414: starting attempt loop 25201 1726882690.23416: running the handler 25201 1726882690.23428: _low_level_execute_command(): starting 25201 1726882690.23435: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 25201 1726882690.24177: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882690.24266: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882690.25922: stdout chunk (state=3): >>>/root <<< 25201 1726882690.26106: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882690.26109: stdout chunk (state=3): >>><<< 25201 1726882690.26111: stderr chunk (state=3): >>><<< 25201 1726882690.26212: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25201 1726882690.26216: _low_level_execute_command(): starting 25201 1726882690.26219: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882690.2612932-25727-241766636808435 `" && echo ansible-tmp-1726882690.2612932-25727-241766636808435="` echo /root/.ansible/tmp/ansible-tmp-1726882690.2612932-25727-241766636808435 `" ) && sleep 0' 25201 1726882690.27403: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882690.27407: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882690.27446: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882690.27449: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882690.27458: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882690.27521: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882690.27525: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882690.27639: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882690.29531: stdout chunk (state=3): >>>ansible-tmp-1726882690.2612932-25727-241766636808435=/root/.ansible/tmp/ansible-tmp-1726882690.2612932-25727-241766636808435 <<< 25201 1726882690.29974: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882690.29977: stdout chunk (state=3): >>><<< 25201 1726882690.29980: stderr chunk (state=3): >>><<< 25201 1726882690.30061: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882690.2612932-25727-241766636808435=/root/.ansible/tmp/ansible-tmp-1726882690.2612932-25727-241766636808435 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25201 1726882690.30070: variable 'ansible_module_compression' from source: unknown 25201 1726882690.30164: ANSIBALLZ: Using lock for package_facts 25201 1726882690.30168: ANSIBALLZ: Acquiring lock 25201 1726882690.30170: ANSIBALLZ: Lock acquired: 140300039962448 25201 1726882690.30173: ANSIBALLZ: Creating module 25201 1726882690.62485: ANSIBALLZ: Writing module into payload 25201 1726882690.62673: ANSIBALLZ: Writing module 25201 1726882690.62705: ANSIBALLZ: 
Renaming module 25201 1726882690.62715: ANSIBALLZ: Done creating module 25201 1726882690.62755: variable 'ansible_facts' from source: unknown 25201 1726882690.62918: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882690.2612932-25727-241766636808435/AnsiballZ_package_facts.py 25201 1726882690.63229: Sending initial data 25201 1726882690.63232: Sent initial data (162 bytes) 25201 1726882690.65243: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882690.65247: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882690.65270: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882690.65284: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882690.65360: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882690.65366: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882690.65374: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882690.65494: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882690.67328: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 25201 1726882690.67423: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 25201 1726882690.67524: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-25201fmfeipqk/tmph6dd0fzy /root/.ansible/tmp/ansible-tmp-1726882690.2612932-25727-241766636808435/AnsiballZ_package_facts.py <<< 25201 1726882690.67618: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 25201 1726882690.70380: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882690.70570: stderr chunk (state=3): >>><<< 25201 1726882690.70573: stdout chunk (state=3): >>><<< 25201 1726882690.70576: done transferring module to remote 25201 1726882690.70578: _low_level_execute_command(): starting 25201 1726882690.70580: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x 
/root/.ansible/tmp/ansible-tmp-1726882690.2612932-25727-241766636808435/ /root/.ansible/tmp/ansible-tmp-1726882690.2612932-25727-241766636808435/AnsiballZ_package_facts.py && sleep 0' 25201 1726882690.71175: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25201 1726882690.71190: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882690.71203: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882690.71219: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882690.71259: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882690.71277: stderr chunk (state=3): >>>debug2: match not found <<< 25201 1726882690.71291: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882690.71308: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25201 1726882690.71318: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 25201 1726882690.71327: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 25201 1726882690.71338: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882690.71350: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882690.71372: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882690.71386: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882690.71401: stderr chunk (state=3): >>>debug2: match found <<< 25201 1726882690.71413: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882690.71492: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882690.71513: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882690.71527: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882690.71651: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882690.73482: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882690.73508: stderr chunk (state=3): >>><<< 25201 1726882690.73511: stdout chunk (state=3): >>><<< 25201 1726882690.73601: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25201 1726882690.73604: _low_level_execute_command(): starting 25201 1726882690.73608: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882690.2612932-25727-241766636808435/AnsiballZ_package_facts.py && sleep 0' 25201 1726882690.74159: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25201 1726882690.74178: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882690.74192: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882690.74207: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882690.74245: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882690.74255: stderr chunk (state=3): >>>debug2: match not found <<< 25201 1726882690.74272: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882690.74288: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25201 1726882690.74298: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 25201 1726882690.74307: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 25201 1726882690.74316: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882690.74327: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882690.74340: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882690.74349: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882690.74357: stderr chunk (state=3): >>>debug2: match found <<< 25201 1726882690.74372: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882690.74444: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882690.74467: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882690.74482: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882690.74613: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882691.20475: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "subscription-manager-rhsm-certificates": [{"name": "subscription-manager-rhsm-certificates", "version": "20220623", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": 
"2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86<<< 25201 1726882691.20490: stdout chunk (state=3): >>>_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dmidecode": [{"name": "dmidecode", "version": "3.6", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": 
"20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba"<<< 25201 1726882691.20540: stdout chunk 
(state=3): >>>, "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-iniparse": [{"name": "python3-iniparse", "version": "0.4", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-inotify": [{"name": "python3-inotify", "version": "0.9.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-decorator": [{"name": "python3-decorator", "version": "4.4.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": 
[{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-cloud-what": [{"name": "python3-cloud-what", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pyt<<< 25201 1726882691.20546: stdout chunk (state=3): >>>hon3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": 
"rpm"}], "virt-what": [{"name": "virt-what", "version": "1.25", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"selinux-policy": [{"name": "selinux-policy", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch":<<< 25201 1726882691.20554: stdout chunk (state=3): >>> 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9<<< 25201 1726882691.20557: stdout chunk (state=3): >>>.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": 
"x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "usermode": [{"name": "usermode", "version": "1.114", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf-plugin-subscription-manager": [{"name": "libdnf-plugin-subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-librepo": [{"name": "python3-librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": <<< 25201 1726882691.20598: stdout chunk (state=3): >>>"rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-subscription-manager-rhsm": [{"name": "python3-subscription-manager-rhsm", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"subscription-manager": [{"name": "subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "policycoreutils-python-utils": [{"name": "policycoreutils-python-utils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "rhc": [{"name": "rhc", "version": "0.2.4", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", 
"release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysini<<< 25201 1726882691.20606: stdout chunk (state=3): >>>t", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": 
"2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "per<<< 25201 1726882691.20624: stdout chunk (state=3): >>>l-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], 
"perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8<<< 25201 1726882691.20645: stdout chunk (state=3): >>>.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": 
"pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch":<<< 25201 1726882691.20661: stdout chunk (state=3): >>> null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": 
"x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "releas<<< 25201 1726882691.20686: stdout chunk (state=3): >>>e": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "1.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 25201 1726882691.22107: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. <<< 25201 1726882691.22154: stderr chunk (state=3): >>><<< 25201 1726882691.22158: stdout chunk (state=3): >>><<< 25201 1726882691.22199: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "subscription-manager-rhsm-certificates": [{"name": "subscription-manager-rhsm-certificates", "version": "20220623", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dmidecode": [{"name": "dmidecode", "version": "3.6", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", 
"release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": 
[{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-iniparse": [{"name": "python3-iniparse", "version": "0.4", 
"release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-inotify": [{"name": "python3-inotify", "version": "0.9.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-decorator": [{"name": "python3-decorator", "version": "4.4.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-cloud-what": [{"name": "python3-cloud-what", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "virt-what": [{"name": "virt-what", "version": "1.25", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "45.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": 
"2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "usermode": [{"name": "usermode", "version": "1.114", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf-plugin-subscription-manager": [{"name": "libdnf-plugin-subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "python3-librepo": [{"name": "python3-librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-subscription-manager-rhsm": [{"name": "python3-subscription-manager-rhsm", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "subscription-manager": [{"name": "subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "policycoreutils-python-utils": [{"name": "policycoreutils-python-utils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "rhc": [{"name": "rhc", "version": "0.2.4", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", 
"epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": 
"3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", 
"version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", 
"epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", 
"version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": 
[{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
25201 1726882691.24190: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882690.2612932-25727-241766636808435/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 25201 1726882691.24193: _low_level_execute_command(): starting 25201 1726882691.24196: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882690.2612932-25727-241766636808435/ > /dev/null 2>&1 && sleep 0' 25201 1726882691.24810: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25201 1726882691.24825: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882691.24846: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882691.24866: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882691.24910: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882691.24924: stderr chunk (state=3): >>>debug2: match not found <<< 25201 1726882691.24939: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882691.24967: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25201 1726882691.24980: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 25201 1726882691.24990: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 25201 1726882691.25005: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882691.25018: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882691.25036: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882691.25048: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882691.25067: stderr chunk (state=3): >>>debug2: match found <<< 25201 1726882691.25083: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882691.25177: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882691.25198: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882691.25213: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882691.25341: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882691.27178: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882691.27220: stderr chunk (state=3): >>><<< 25201 1726882691.27223: stdout chunk (state=3): >>><<< 25201 1726882691.27236: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25201 1726882691.27242: handler run complete 25201 1726882691.27734: variable 'ansible_facts' from source: unknown 25201 1726882691.28010: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882691.29908: variable 'ansible_facts' from source: unknown 25201 1726882691.30182: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882691.30625: attempt loop complete, returning result 25201 1726882691.30636: _execute() done 25201 1726882691.30639: dumping result to json 25201 1726882691.30769: done dumping result, returning 25201 1726882691.30778: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [0e448fcc-3ce9-313b-197e-000000000202] 25201 1726882691.30783: sending task result for task 0e448fcc-3ce9-313b-197e-000000000202 25201 1726882691.32050: done sending task result for task 0e448fcc-3ce9-313b-197e-000000000202 25201 1726882691.32053: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 25201 1726882691.32095: no more pending results, returning what we have 25201 1726882691.32097: results queue empty 25201 1726882691.32098: checking for any_errors_fatal 25201 1726882691.32102: done checking for any_errors_fatal 25201 1726882691.32102: checking for max_fail_percentage 25201 1726882691.32103: done checking for max_fail_percentage 25201 1726882691.32104: checking to see if all hosts have failed and the running result is not ok 25201 1726882691.32104: done checking to see if all hosts have failed 25201 1726882691.32105: getting the remaining hosts for this loop 25201 1726882691.32106: done getting the remaining hosts for this loop 25201 1726882691.32110: getting the next task for host managed_node2 25201 1726882691.32116: done getting next task for host managed_node2 25201 1726882691.32119: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 25201 1726882691.32121: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25201 1726882691.32127: getting variables 25201 1726882691.32128: in VariableManager get_vars() 25201 1726882691.32153: Calling all_inventory to load vars for managed_node2 25201 1726882691.32154: Calling groups_inventory to load vars for managed_node2 25201 1726882691.32156: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882691.32163: Calling all_plugins_play to load vars for managed_node2 25201 1726882691.32167: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882691.32169: Calling groups_plugins_play to load vars for managed_node2 25201 1726882691.32923: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882691.33853: done with get_vars() 25201 1726882691.33876: done getting variables 25201 1726882691.33918: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 21:38:11 -0400 (0:00:01.124) 0:00:12.514 ****** 25201 1726882691.33942: entering _queue_task() for managed_node2/debug 25201 1726882691.34154: worker is 1 (out of 1 available) 25201 1726882691.34171: exiting _queue_task() for managed_node2/debug 25201 1726882691.34182: done queuing things up, now waiting for results queue to drain 25201 1726882691.34184: waiting for pending results... 
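The task announced above loads the debug action, and its result further down prints "Using network provider: nm" from the network_provider fact set earlier in the role. A plausible sketch of such a task (the wording of the real task at main.yml:7 may differ slightly):

- name: Print network provider
  ansible.builtin.debug:
    msg: "Using network provider: {{ network_provider }}"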
25201 1726882691.34345: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider 25201 1726882691.34429: in run() - task 0e448fcc-3ce9-313b-197e-000000000018 25201 1726882691.34441: variable 'ansible_search_path' from source: unknown 25201 1726882691.34444: variable 'ansible_search_path' from source: unknown 25201 1726882691.34473: calling self._execute() 25201 1726882691.34535: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882691.34539: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882691.34547: variable 'omit' from source: magic vars 25201 1726882691.34881: variable 'ansible_distribution_major_version' from source: facts 25201 1726882691.34899: Evaluated conditional (ansible_distribution_major_version != '6'): True 25201 1726882691.34910: variable 'omit' from source: magic vars 25201 1726882691.34973: variable 'omit' from source: magic vars 25201 1726882691.35078: variable 'network_provider' from source: set_fact 25201 1726882691.35091: variable 'omit' from source: magic vars 25201 1726882691.35122: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25201 1726882691.35147: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25201 1726882691.35174: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25201 1726882691.35188: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25201 1726882691.35196: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25201 1726882691.35219: variable 'inventory_hostname' from source: host vars for 'managed_node2' 25201 1726882691.35222: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882691.35225: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882691.35302: Set connection var ansible_shell_executable to /bin/sh 25201 1726882691.35305: Set connection var ansible_pipelining to False 25201 1726882691.35311: Set connection var ansible_connection to ssh 25201 1726882691.35316: Set connection var ansible_module_compression to ZIP_DEFLATED 25201 1726882691.35318: Set connection var ansible_shell_type to sh 25201 1726882691.35325: Set connection var ansible_timeout to 10 25201 1726882691.35340: variable 'ansible_shell_executable' from source: unknown 25201 1726882691.35343: variable 'ansible_connection' from source: unknown 25201 1726882691.35346: variable 'ansible_module_compression' from source: unknown 25201 1726882691.35348: variable 'ansible_shell_type' from source: unknown 25201 1726882691.35350: variable 'ansible_shell_executable' from source: unknown 25201 1726882691.35353: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882691.35355: variable 'ansible_pipelining' from source: unknown 25201 1726882691.35358: variable 'ansible_timeout' from source: unknown 25201 1726882691.35362: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882691.35460: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 
(found_in_cache=True, class_only=False) 25201 1726882691.35473: variable 'omit' from source: magic vars 25201 1726882691.35477: starting attempt loop 25201 1726882691.35480: running the handler 25201 1726882691.35517: handler run complete 25201 1726882691.35527: attempt loop complete, returning result 25201 1726882691.35530: _execute() done 25201 1726882691.35533: dumping result to json 25201 1726882691.35535: done dumping result, returning 25201 1726882691.35542: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider [0e448fcc-3ce9-313b-197e-000000000018] 25201 1726882691.35547: sending task result for task 0e448fcc-3ce9-313b-197e-000000000018 25201 1726882691.35626: done sending task result for task 0e448fcc-3ce9-313b-197e-000000000018 25201 1726882691.35629: WORKER PROCESS EXITING ok: [managed_node2] => {} MSG: Using network provider: nm 25201 1726882691.35691: no more pending results, returning what we have 25201 1726882691.35694: results queue empty 25201 1726882691.35695: checking for any_errors_fatal 25201 1726882691.35703: done checking for any_errors_fatal 25201 1726882691.35704: checking for max_fail_percentage 25201 1726882691.35706: done checking for max_fail_percentage 25201 1726882691.35707: checking to see if all hosts have failed and the running result is not ok 25201 1726882691.35707: done checking to see if all hosts have failed 25201 1726882691.35708: getting the remaining hosts for this loop 25201 1726882691.35710: done getting the remaining hosts for this loop 25201 1726882691.35713: getting the next task for host managed_node2 25201 1726882691.35719: done getting next task for host managed_node2 25201 1726882691.35722: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 25201 1726882691.35725: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25201 1726882691.35734: getting variables 25201 1726882691.35735: in VariableManager get_vars() 25201 1726882691.35776: Calling all_inventory to load vars for managed_node2 25201 1726882691.35779: Calling groups_inventory to load vars for managed_node2 25201 1726882691.35781: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882691.35789: Calling all_plugins_play to load vars for managed_node2 25201 1726882691.35791: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882691.35794: Calling groups_plugins_play to load vars for managed_node2 25201 1726882691.36539: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882691.37468: done with get_vars() 25201 1726882691.37486: done getting variables 25201 1726882691.37524: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 21:38:11 -0400 (0:00:00.036) 0:00:12.550 ****** 25201 1726882691.37547: entering _queue_task() for managed_node2/fail 25201 1726882691.37729: worker is 1 (out of 1 available) 25201 1726882691.37743: exiting _queue_task() for managed_node2/fail 25201 1726882691.37754: done queuing things up, now waiting for results queue to drain 25201 1726882691.37756: waiting for pending results... 
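This task loads the fail action and is skipped below with false_condition network_state != {}. A sketch of the shape of such a guard; the real task at main.yml:11 presumably also checks that the initscripts provider is in use, which a short-circuited skip result would not show:

- name: Abort applying the network state configuration if using the network_state variable with the initscripts provider
  ansible.builtin.fail:
    msg: Applying the network state configuration is not supported with the initscripts provider   # placeholder wording
  when: network_state != {}   # the only condition reported in the skip result; further conditions are likely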
25201 1726882691.37920: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 25201 1726882691.38000: in run() - task 0e448fcc-3ce9-313b-197e-000000000019 25201 1726882691.38010: variable 'ansible_search_path' from source: unknown 25201 1726882691.38014: variable 'ansible_search_path' from source: unknown 25201 1726882691.38042: calling self._execute() 25201 1726882691.38114: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882691.38118: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882691.38126: variable 'omit' from source: magic vars 25201 1726882691.38388: variable 'ansible_distribution_major_version' from source: facts 25201 1726882691.38397: Evaluated conditional (ansible_distribution_major_version != '6'): True 25201 1726882691.38481: variable 'network_state' from source: role '' defaults 25201 1726882691.38489: Evaluated conditional (network_state != {}): False 25201 1726882691.38493: when evaluation is False, skipping this task 25201 1726882691.38495: _execute() done 25201 1726882691.38498: dumping result to json 25201 1726882691.38500: done dumping result, returning 25201 1726882691.38506: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0e448fcc-3ce9-313b-197e-000000000019] 25201 1726882691.38511: sending task result for task 0e448fcc-3ce9-313b-197e-000000000019 25201 1726882691.38593: done sending task result for task 0e448fcc-3ce9-313b-197e-000000000019 25201 1726882691.38597: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 25201 1726882691.38661: no more pending results, returning what we have 25201 1726882691.38666: results queue empty 25201 1726882691.38667: checking for any_errors_fatal 25201 1726882691.38673: done checking for any_errors_fatal 25201 1726882691.38674: checking for max_fail_percentage 25201 1726882691.38675: done checking for max_fail_percentage 25201 1726882691.38676: checking to see if all hosts have failed and the running result is not ok 25201 1726882691.38677: done checking to see if all hosts have failed 25201 1726882691.38677: getting the remaining hosts for this loop 25201 1726882691.38679: done getting the remaining hosts for this loop 25201 1726882691.38681: getting the next task for host managed_node2 25201 1726882691.38687: done getting next task for host managed_node2 25201 1726882691.38690: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 25201 1726882691.38693: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25201 1726882691.38703: getting variables 25201 1726882691.38704: in VariableManager get_vars() 25201 1726882691.38733: Calling all_inventory to load vars for managed_node2 25201 1726882691.38735: Calling groups_inventory to load vars for managed_node2 25201 1726882691.38737: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882691.38743: Calling all_plugins_play to load vars for managed_node2 25201 1726882691.38744: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882691.38746: Calling groups_plugins_play to load vars for managed_node2 25201 1726882691.39551: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882691.40469: done with get_vars() 25201 1726882691.40483: done getting variables 25201 1726882691.40521: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 21:38:11 -0400 (0:00:00.029) 0:00:12.580 ****** 25201 1726882691.40542: entering _queue_task() for managed_node2/fail 25201 1726882691.40719: worker is 1 (out of 1 available) 25201 1726882691.40732: exiting _queue_task() for managed_node2/fail 25201 1726882691.40744: done queuing things up, now waiting for results queue to drain 25201 1726882691.40745: waiting for pending results... 
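Same pattern as the previous guard: the fail action is skipped below because network_state != {} evaluated to False, so the version comparison implied by the task name is never reported. A minimal sketch under that assumption:

- name: Abort applying the network state configuration if the system version of the managed host is below 8
  ansible.builtin.fail:
    msg: Applying the network state configuration requires a managed host running version 8 or later   # placeholder wording
  when:
    - network_state != {}                             # condition reported in the skip result
    - ansible_distribution_major_version | int < 8    # implied by the task name, not visible in the trace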
25201 1726882691.40901: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 25201 1726882691.40978: in run() - task 0e448fcc-3ce9-313b-197e-00000000001a 25201 1726882691.40991: variable 'ansible_search_path' from source: unknown 25201 1726882691.40995: variable 'ansible_search_path' from source: unknown 25201 1726882691.41020: calling self._execute() 25201 1726882691.41081: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882691.41087: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882691.41098: variable 'omit' from source: magic vars 25201 1726882691.41347: variable 'ansible_distribution_major_version' from source: facts 25201 1726882691.41357: Evaluated conditional (ansible_distribution_major_version != '6'): True 25201 1726882691.41441: variable 'network_state' from source: role '' defaults 25201 1726882691.41450: Evaluated conditional (network_state != {}): False 25201 1726882691.41453: when evaluation is False, skipping this task 25201 1726882691.41456: _execute() done 25201 1726882691.41458: dumping result to json 25201 1726882691.41460: done dumping result, returning 25201 1726882691.41470: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0e448fcc-3ce9-313b-197e-00000000001a] 25201 1726882691.41476: sending task result for task 0e448fcc-3ce9-313b-197e-00000000001a 25201 1726882691.41557: done sending task result for task 0e448fcc-3ce9-313b-197e-00000000001a 25201 1726882691.41560: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 25201 1726882691.41609: no more pending results, returning what we have 25201 1726882691.41612: results queue empty 25201 1726882691.41613: checking for any_errors_fatal 25201 1726882691.41618: done checking for any_errors_fatal 25201 1726882691.41619: checking for max_fail_percentage 25201 1726882691.41620: done checking for max_fail_percentage 25201 1726882691.41621: checking to see if all hosts have failed and the running result is not ok 25201 1726882691.41622: done checking to see if all hosts have failed 25201 1726882691.41622: getting the remaining hosts for this loop 25201 1726882691.41624: done getting the remaining hosts for this loop 25201 1726882691.41626: getting the next task for host managed_node2 25201 1726882691.41632: done getting next task for host managed_node2 25201 1726882691.41635: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 25201 1726882691.41637: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25201 1726882691.41650: getting variables 25201 1726882691.41652: in VariableManager get_vars() 25201 1726882691.41688: Calling all_inventory to load vars for managed_node2 25201 1726882691.41690: Calling groups_inventory to load vars for managed_node2 25201 1726882691.41691: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882691.41697: Calling all_plugins_play to load vars for managed_node2 25201 1726882691.41699: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882691.41701: Calling groups_plugins_play to load vars for managed_node2 25201 1726882691.42440: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882691.43449: done with get_vars() 25201 1726882691.43462: done getting variables 25201 1726882691.43504: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 21:38:11 -0400 (0:00:00.029) 0:00:12.609 ****** 25201 1726882691.43528: entering _queue_task() for managed_node2/fail 25201 1726882691.43705: worker is 1 (out of 1 available) 25201 1726882691.43719: exiting _queue_task() for managed_node2/fail 25201 1726882691.43730: done queuing things up, now waiting for results queue to drain 25201 1726882691.43731: waiting for pending results... 
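Here the skip result below reports false_condition ansible_distribution_major_version | int > 9, i.e. the teaming abort only fires on EL10 or later. A sketch of that guard (the real task presumably also checks whether any team connections were actually requested):

- name: Abort applying teaming configuration if the system version of the managed host is EL10 or later
  ansible.builtin.fail:
    msg: Team interfaces are not supported on EL10 or later   # placeholder wording
  when: ansible_distribution_major_version | int > 9          # condition reported in the skip result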
25201 1726882691.43891: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 25201 1726882691.43965: in run() - task 0e448fcc-3ce9-313b-197e-00000000001b 25201 1726882691.43979: variable 'ansible_search_path' from source: unknown 25201 1726882691.43983: variable 'ansible_search_path' from source: unknown 25201 1726882691.44008: calling self._execute() 25201 1726882691.44077: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882691.44080: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882691.44087: variable 'omit' from source: magic vars 25201 1726882691.44336: variable 'ansible_distribution_major_version' from source: facts 25201 1726882691.44346: Evaluated conditional (ansible_distribution_major_version != '6'): True 25201 1726882691.44463: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 25201 1726882691.46000: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 25201 1726882691.46051: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 25201 1726882691.46081: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 25201 1726882691.46107: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 25201 1726882691.46129: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 25201 1726882691.46186: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25201 1726882691.46205: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25201 1726882691.46224: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25201 1726882691.46252: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25201 1726882691.46263: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25201 1726882691.46330: variable 'ansible_distribution_major_version' from source: facts 25201 1726882691.46342: Evaluated conditional (ansible_distribution_major_version | int > 9): False 25201 1726882691.46347: when evaluation is False, skipping this task 25201 1726882691.46350: _execute() done 25201 1726882691.46353: dumping result to json 25201 1726882691.46355: done dumping result, returning 25201 1726882691.46360: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0e448fcc-3ce9-313b-197e-00000000001b] 25201 1726882691.46371: sending task result for task 
0e448fcc-3ce9-313b-197e-00000000001b 25201 1726882691.46445: done sending task result for task 0e448fcc-3ce9-313b-197e-00000000001b 25201 1726882691.46448: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int > 9", "skip_reason": "Conditional result was False" } 25201 1726882691.46516: no more pending results, returning what we have 25201 1726882691.46519: results queue empty 25201 1726882691.46521: checking for any_errors_fatal 25201 1726882691.46525: done checking for any_errors_fatal 25201 1726882691.46525: checking for max_fail_percentage 25201 1726882691.46527: done checking for max_fail_percentage 25201 1726882691.46528: checking to see if all hosts have failed and the running result is not ok 25201 1726882691.46529: done checking to see if all hosts have failed 25201 1726882691.46529: getting the remaining hosts for this loop 25201 1726882691.46531: done getting the remaining hosts for this loop 25201 1726882691.46534: getting the next task for host managed_node2 25201 1726882691.46539: done getting next task for host managed_node2 25201 1726882691.46542: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 25201 1726882691.46545: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25201 1726882691.46557: getting variables 25201 1726882691.46558: in VariableManager get_vars() 25201 1726882691.46602: Calling all_inventory to load vars for managed_node2 25201 1726882691.46605: Calling groups_inventory to load vars for managed_node2 25201 1726882691.46607: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882691.46614: Calling all_plugins_play to load vars for managed_node2 25201 1726882691.46616: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882691.46617: Calling groups_plugins_play to load vars for managed_node2 25201 1726882691.47367: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882691.48300: done with get_vars() 25201 1726882691.48314: done getting variables 25201 1726882691.48381: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 21:38:11 -0400 (0:00:00.048) 0:00:12.658 ****** 25201 1726882691.48403: entering _queue_task() for managed_node2/dnf 25201 1726882691.48579: worker is 1 (out of 1 available) 25201 1726882691.48592: exiting _queue_task() for managed_node2/dnf 25201 1726882691.48602: done queuing things up, now waiting for results queue to drain 25201 1726882691.48604: waiting for pending results... 
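The trace below evaluates two conditionals for this task: ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7 (True) and __network_wireless_connections_defined or __network_team_connections_defined (False), so the dnf action is skipped. A sketch of such an update check, with the package list left as a placeholder because it is not visible in the trace:

- name: Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
  ansible.builtin.dnf:
    name: "{{ network_packages }}"   # placeholder; the real task takes its package list from role variables
    state: latest
  check_mode: true                   # assumption: the task checks availability rather than installing
  when:
    - ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7
    - __network_wireless_connections_defined or __network_team_connections_defined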
25201 1726882691.48754: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 25201 1726882691.48831: in run() - task 0e448fcc-3ce9-313b-197e-00000000001c 25201 1726882691.48842: variable 'ansible_search_path' from source: unknown 25201 1726882691.48849: variable 'ansible_search_path' from source: unknown 25201 1726882691.48881: calling self._execute() 25201 1726882691.48943: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882691.48947: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882691.48960: variable 'omit' from source: magic vars 25201 1726882691.49211: variable 'ansible_distribution_major_version' from source: facts 25201 1726882691.49220: Evaluated conditional (ansible_distribution_major_version != '6'): True 25201 1726882691.49349: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 25201 1726882691.51049: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 25201 1726882691.51092: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 25201 1726882691.51121: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 25201 1726882691.51155: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 25201 1726882691.51176: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 25201 1726882691.51231: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25201 1726882691.51250: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25201 1726882691.51268: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25201 1726882691.51295: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25201 1726882691.51305: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25201 1726882691.51379: variable 'ansible_distribution' from source: facts 25201 1726882691.51383: variable 'ansible_distribution_major_version' from source: facts 25201 1726882691.51393: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 25201 1726882691.51469: variable '__network_wireless_connections_defined' from source: role '' defaults 25201 1726882691.51547: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25201 1726882691.51569: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25201 1726882691.51585: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25201 1726882691.51610: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25201 1726882691.51620: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25201 1726882691.51648: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25201 1726882691.51669: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25201 1726882691.51685: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25201 1726882691.51709: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25201 1726882691.51719: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25201 1726882691.51744: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25201 1726882691.51765: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25201 1726882691.51783: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25201 1726882691.51807: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25201 1726882691.51817: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25201 1726882691.51912: variable 'network_connections' from source: task vars 25201 1726882691.51921: variable 'interface' from source: play vars 25201 1726882691.51970: variable 'interface' from source: play vars 25201 1726882691.52018: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 25201 1726882691.52127: Loading TestModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 25201 1726882691.52152: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 25201 1726882691.52176: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 25201 1726882691.52196: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 25201 1726882691.52232: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 25201 1726882691.52247: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 25201 1726882691.52272: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 25201 1726882691.52291: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 25201 1726882691.52334: variable '__network_team_connections_defined' from source: role '' defaults 25201 1726882691.52494: variable 'network_connections' from source: task vars 25201 1726882691.52497: variable 'interface' from source: play vars 25201 1726882691.52542: variable 'interface' from source: play vars 25201 1726882691.52567: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 25201 1726882691.52572: when evaluation is False, skipping this task 25201 1726882691.52575: _execute() done 25201 1726882691.52577: dumping result to json 25201 1726882691.52581: done dumping result, returning 25201 1726882691.52587: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0e448fcc-3ce9-313b-197e-00000000001c] 25201 1726882691.52592: sending task result for task 0e448fcc-3ce9-313b-197e-00000000001c 25201 1726882691.52680: done sending task result for task 0e448fcc-3ce9-313b-197e-00000000001c 25201 1726882691.52683: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 25201 1726882691.52726: no more pending results, returning what we have 25201 1726882691.52730: results queue empty 25201 1726882691.52731: checking for any_errors_fatal 25201 1726882691.52741: done checking for any_errors_fatal 25201 1726882691.52742: checking for max_fail_percentage 25201 1726882691.52744: done checking for max_fail_percentage 25201 1726882691.52744: checking to see if all hosts have failed and the running result is not ok 25201 1726882691.52745: done checking to see if all hosts have failed 25201 1726882691.52749: getting the remaining hosts for this loop 25201 1726882691.52751: done getting the remaining hosts for this loop 25201 1726882691.52754: getting the next task for host managed_node2 25201 1726882691.52760: done getting next task for host managed_node2 25201 1726882691.52766: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates 
for network packages are available through the YUM package manager due to wireless or team interfaces 25201 1726882691.52769: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25201 1726882691.52781: getting variables 25201 1726882691.52782: in VariableManager get_vars() 25201 1726882691.52815: Calling all_inventory to load vars for managed_node2 25201 1726882691.52817: Calling groups_inventory to load vars for managed_node2 25201 1726882691.52819: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882691.52827: Calling all_plugins_play to load vars for managed_node2 25201 1726882691.52829: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882691.52832: Calling groups_plugins_play to load vars for managed_node2 25201 1726882691.53655: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882691.54571: done with get_vars() 25201 1726882691.54586: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 25201 1726882691.54634: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 21:38:11 -0400 (0:00:00.062) 0:00:12.721 ****** 25201 1726882691.54654: entering _queue_task() for managed_node2/yum 25201 1726882691.54656: Creating lock for yum 25201 1726882691.54898: worker is 1 (out of 1 available) 25201 1726882691.54911: exiting _queue_task() for managed_node2/yum 25201 1726882691.54922: done queuing things up, now waiting for results queue to drain 25201 1726882691.54923: waiting for pending results... 
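The line above notes "redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf", and the task is skipped below because ansible_distribution_major_version | int < 8 is False on this EL9 host. A sketch of the legacy counterpart of the previous check, with the same placeholder caveats:

- name: Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
  ansible.builtin.yum:
    name: "{{ network_packages }}"   # placeholder, as in the DNF variant
    state: latest
  check_mode: true                   # assumption, as in the DNF variant
  when: ansible_distribution_major_version | int < 8   # condition reported in the skip result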
25201 1726882691.55075: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 25201 1726882691.55155: in run() - task 0e448fcc-3ce9-313b-197e-00000000001d 25201 1726882691.55168: variable 'ansible_search_path' from source: unknown 25201 1726882691.55172: variable 'ansible_search_path' from source: unknown 25201 1726882691.55200: calling self._execute() 25201 1726882691.55261: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882691.55270: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882691.55278: variable 'omit' from source: magic vars 25201 1726882691.55525: variable 'ansible_distribution_major_version' from source: facts 25201 1726882691.55535: Evaluated conditional (ansible_distribution_major_version != '6'): True 25201 1726882691.55652: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 25201 1726882691.57598: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 25201 1726882691.57677: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 25201 1726882691.57717: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 25201 1726882691.57755: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 25201 1726882691.57790: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 25201 1726882691.57873: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25201 1726882691.57898: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25201 1726882691.57915: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25201 1726882691.57950: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25201 1726882691.57982: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25201 1726882691.58047: variable 'ansible_distribution_major_version' from source: facts 25201 1726882691.58059: Evaluated conditional (ansible_distribution_major_version | int < 8): False 25201 1726882691.58061: when evaluation is False, skipping this task 25201 1726882691.58066: _execute() done 25201 1726882691.58072: dumping result to json 25201 1726882691.58075: done dumping result, returning 25201 1726882691.58081: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0e448fcc-3ce9-313b-197e-00000000001d] 25201 
1726882691.58086: sending task result for task 0e448fcc-3ce9-313b-197e-00000000001d 25201 1726882691.58186: done sending task result for task 0e448fcc-3ce9-313b-197e-00000000001d 25201 1726882691.58188: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 25201 1726882691.58236: no more pending results, returning what we have 25201 1726882691.58240: results queue empty 25201 1726882691.58241: checking for any_errors_fatal 25201 1726882691.58246: done checking for any_errors_fatal 25201 1726882691.58247: checking for max_fail_percentage 25201 1726882691.58249: done checking for max_fail_percentage 25201 1726882691.58250: checking to see if all hosts have failed and the running result is not ok 25201 1726882691.58251: done checking to see if all hosts have failed 25201 1726882691.58251: getting the remaining hosts for this loop 25201 1726882691.58253: done getting the remaining hosts for this loop 25201 1726882691.58256: getting the next task for host managed_node2 25201 1726882691.58262: done getting next task for host managed_node2 25201 1726882691.58267: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 25201 1726882691.58270: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25201 1726882691.58282: getting variables 25201 1726882691.58284: in VariableManager get_vars() 25201 1726882691.58317: Calling all_inventory to load vars for managed_node2 25201 1726882691.58319: Calling groups_inventory to load vars for managed_node2 25201 1726882691.58321: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882691.58329: Calling all_plugins_play to load vars for managed_node2 25201 1726882691.58332: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882691.58334: Calling groups_plugins_play to load vars for managed_node2 25201 1726882691.59096: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882691.60426: done with get_vars() 25201 1726882691.60456: done getting variables 25201 1726882691.60514: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 21:38:11 -0400 (0:00:00.058) 0:00:12.780 ****** 25201 1726882691.60546: entering _queue_task() for managed_node2/fail 25201 1726882691.60784: worker is 1 (out of 1 available) 25201 1726882691.60798: exiting _queue_task() for managed_node2/fail 25201 1726882691.60807: done queuing things up, now waiting for results queue to drain 25201 1726882691.60809: waiting for pending results... 
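This task loads the fail action and, judging from the variables pulled in below (__network_wireless_connections_defined, __network_team_connections_defined, network_connections), it gates a NetworkManager restart behind wireless or team usage. A sketch under that reading; the consent mechanism itself is not visible in this part of the trace, so it is only noted as a comment:

- name: Ask user's consent to restart NetworkManager due to wireless or team interfaces
  ansible.builtin.fail:
    msg: NetworkManager must be restarted to handle wireless or team interfaces; confirm the restart to proceed.   # placeholder wording
  when:
    - __network_wireless_connections_defined or __network_team_connections_defined   # conditions being evaluated in the trace
    # an additional condition covering the user's confirmation variable is implied by the task name but not shown here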
25201 1726882691.61065: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 25201 1726882691.61191: in run() - task 0e448fcc-3ce9-313b-197e-00000000001e 25201 1726882691.61209: variable 'ansible_search_path' from source: unknown 25201 1726882691.61216: variable 'ansible_search_path' from source: unknown 25201 1726882691.61255: calling self._execute() 25201 1726882691.61336: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882691.61347: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882691.61366: variable 'omit' from source: magic vars 25201 1726882691.61706: variable 'ansible_distribution_major_version' from source: facts 25201 1726882691.61722: Evaluated conditional (ansible_distribution_major_version != '6'): True 25201 1726882691.61840: variable '__network_wireless_connections_defined' from source: role '' defaults 25201 1726882691.62039: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 25201 1726882691.64321: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 25201 1726882691.64392: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 25201 1726882691.64427: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 25201 1726882691.64461: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 25201 1726882691.64498: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 25201 1726882691.64580: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25201 1726882691.64620: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25201 1726882691.64651: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25201 1726882691.64700: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25201 1726882691.64726: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25201 1726882691.64777: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25201 1726882691.64808: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25201 1726882691.64842: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25201 1726882691.64890: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25201 1726882691.64910: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25201 1726882691.64957: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25201 1726882691.64988: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25201 1726882691.65017: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25201 1726882691.65067: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25201 1726882691.65090: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25201 1726882691.65362: variable 'network_connections' from source: task vars 25201 1726882691.65384: variable 'interface' from source: play vars 25201 1726882691.65457: variable 'interface' from source: play vars 25201 1726882691.65539: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 25201 1726882691.65822: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 25201 1726882691.65865: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 25201 1726882691.65985: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 25201 1726882691.66020: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 25201 1726882691.66069: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 25201 1726882691.66097: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 25201 1726882691.66128: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 25201 1726882691.66161: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 25201 1726882691.66222: variable '__network_team_connections_defined' from source: role '' defaults 25201 1726882691.66484: variable 'network_connections' from 
source: task vars 25201 1726882691.66495: variable 'interface' from source: play vars 25201 1726882691.66557: variable 'interface' from source: play vars 25201 1726882691.66598: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 25201 1726882691.66607: when evaluation is False, skipping this task 25201 1726882691.66615: _execute() done 25201 1726882691.66621: dumping result to json 25201 1726882691.66629: done dumping result, returning 25201 1726882691.66640: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0e448fcc-3ce9-313b-197e-00000000001e] 25201 1726882691.66650: sending task result for task 0e448fcc-3ce9-313b-197e-00000000001e 25201 1726882691.66773: done sending task result for task 0e448fcc-3ce9-313b-197e-00000000001e 25201 1726882691.66782: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 25201 1726882691.66836: no more pending results, returning what we have 25201 1726882691.66840: results queue empty 25201 1726882691.66841: checking for any_errors_fatal 25201 1726882691.66849: done checking for any_errors_fatal 25201 1726882691.66850: checking for max_fail_percentage 25201 1726882691.66852: done checking for max_fail_percentage 25201 1726882691.66852: checking to see if all hosts have failed and the running result is not ok 25201 1726882691.66853: done checking to see if all hosts have failed 25201 1726882691.66854: getting the remaining hosts for this loop 25201 1726882691.66856: done getting the remaining hosts for this loop 25201 1726882691.66859: getting the next task for host managed_node2 25201 1726882691.66870: done getting next task for host managed_node2 25201 1726882691.66874: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 25201 1726882691.66877: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25201 1726882691.66890: getting variables 25201 1726882691.66892: in VariableManager get_vars() 25201 1726882691.66931: Calling all_inventory to load vars for managed_node2 25201 1726882691.66934: Calling groups_inventory to load vars for managed_node2 25201 1726882691.66936: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882691.66945: Calling all_plugins_play to load vars for managed_node2 25201 1726882691.66947: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882691.66950: Calling groups_plugins_play to load vars for managed_node2 25201 1726882691.69576: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882691.72195: done with get_vars() 25201 1726882691.72226: done getting variables 25201 1726882691.72291: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 21:38:11 -0400 (0:00:00.117) 0:00:12.897 ****** 25201 1726882691.72327: entering _queue_task() for managed_node2/package 25201 1726882691.72619: worker is 1 (out of 1 available) 25201 1726882691.72631: exiting _queue_task() for managed_node2/package 25201 1726882691.72643: done queuing things up, now waiting for results queue to drain 25201 1726882691.72645: waiting for pending results... 25201 1726882691.72989: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages 25201 1726882691.73120: in run() - task 0e448fcc-3ce9-313b-197e-00000000001f 25201 1726882691.73138: variable 'ansible_search_path' from source: unknown 25201 1726882691.73148: variable 'ansible_search_path' from source: unknown 25201 1726882691.73189: calling self._execute() 25201 1726882691.73282: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882691.73292: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882691.73307: variable 'omit' from source: magic vars 25201 1726882691.73658: variable 'ansible_distribution_major_version' from source: facts 25201 1726882691.73679: Evaluated conditional (ansible_distribution_major_version != '6'): True 25201 1726882691.73875: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 25201 1726882691.74131: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 25201 1726882691.74180: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 25201 1726882691.74222: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 25201 1726882691.74260: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 25201 1726882691.74375: variable 'network_packages' from source: role '' defaults 25201 1726882691.74483: variable '__network_provider_setup' from source: role '' defaults 25201 1726882691.74497: variable '__network_service_name_default_nm' from source: role '' defaults 25201 1726882691.74573: variable 
'__network_service_name_default_nm' from source: role '' defaults 25201 1726882691.74588: variable '__network_packages_default_nm' from source: role '' defaults 25201 1726882691.74653: variable '__network_packages_default_nm' from source: role '' defaults 25201 1726882691.74827: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 25201 1726882691.80942: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 25201 1726882691.81007: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 25201 1726882691.81067: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 25201 1726882691.81105: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 25201 1726882691.81141: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 25201 1726882691.81217: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25201 1726882691.81255: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25201 1726882691.81288: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25201 1726882691.81331: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25201 1726882691.81355: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25201 1726882691.81408: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25201 1726882691.81439: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25201 1726882691.81477: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25201 1726882691.81519: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25201 1726882691.81536: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25201 1726882691.81771: variable '__network_packages_default_gobject_packages' from source: role '' defaults 25201 1726882691.81895: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25201 1726882691.81922: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25201 1726882691.81948: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25201 1726882691.81997: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25201 1726882691.82018: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25201 1726882691.82118: variable 'ansible_python' from source: facts 25201 1726882691.82143: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 25201 1726882691.82232: variable '__network_wpa_supplicant_required' from source: role '' defaults 25201 1726882691.82319: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 25201 1726882691.82447: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25201 1726882691.82478: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25201 1726882691.82506: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25201 1726882691.82551: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25201 1726882691.82572: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25201 1726882691.82617: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25201 1726882691.82675: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25201 1726882691.82704: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25201 1726882691.82749: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25201 1726882691.82780: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25201 1726882691.82922: variable 'network_connections' from source: task vars 25201 1726882691.82932: variable 'interface' from source: play vars 25201 1726882691.83035: variable 'interface' from source: play vars 25201 1726882691.83108: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 25201 1726882691.83137: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 25201 1726882691.83172: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 25201 1726882691.83210: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 25201 1726882691.83248: variable '__network_wireless_connections_defined' from source: role '' defaults 25201 1726882691.83769: variable 'network_connections' from source: task vars 25201 1726882691.83839: variable 'interface' from source: play vars 25201 1726882691.84056: variable 'interface' from source: play vars 25201 1726882691.84116: variable '__network_packages_default_wireless' from source: role '' defaults 25201 1726882691.84247: variable '__network_wireless_connections_defined' from source: role '' defaults 25201 1726882691.84987: variable 'network_connections' from source: task vars 25201 1726882691.84997: variable 'interface' from source: play vars 25201 1726882691.85068: variable 'interface' from source: play vars 25201 1726882691.85100: variable '__network_packages_default_team' from source: role '' defaults 25201 1726882691.85193: variable '__network_team_connections_defined' from source: role '' defaults 25201 1726882691.85513: variable 'network_connections' from source: task vars 25201 1726882691.85523: variable 'interface' from source: play vars 25201 1726882691.85596: variable 'interface' from source: play vars 25201 1726882691.85658: variable '__network_service_name_default_initscripts' from source: role '' defaults 25201 1726882691.85726: variable '__network_service_name_default_initscripts' from source: role '' defaults 25201 1726882691.85738: variable '__network_packages_default_initscripts' from source: role '' defaults 25201 1726882691.85805: variable '__network_packages_default_initscripts' from source: role '' defaults 25201 1726882691.86039: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 25201 1726882691.86479: variable 'network_connections' from source: task vars 25201 1726882691.86488: variable 'interface' from source: play vars 25201 1726882691.86554: variable 'interface' from source: play vars 25201 1726882691.86571: variable 'ansible_distribution' from source: facts 25201 1726882691.86580: variable '__network_rh_distros' from source: role '' defaults 25201 1726882691.86590: variable 'ansible_distribution_major_version' from source: facts 25201 1726882691.86614: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 25201 1726882691.86908: variable 'ansible_distribution' from source: facts 25201 
1726882691.86916: variable '__network_rh_distros' from source: role '' defaults 25201 1726882691.86924: variable 'ansible_distribution_major_version' from source: facts 25201 1726882691.86938: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 25201 1726882691.87119: variable 'ansible_distribution' from source: facts 25201 1726882691.87127: variable '__network_rh_distros' from source: role '' defaults 25201 1726882691.87135: variable 'ansible_distribution_major_version' from source: facts 25201 1726882691.87175: variable 'network_provider' from source: set_fact 25201 1726882691.87195: variable 'ansible_facts' from source: unknown 25201 1726882691.87886: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 25201 1726882691.87893: when evaluation is False, skipping this task 25201 1726882691.87899: _execute() done 25201 1726882691.87904: dumping result to json 25201 1726882691.87910: done dumping result, returning 25201 1726882691.87921: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages [0e448fcc-3ce9-313b-197e-00000000001f] 25201 1726882691.87929: sending task result for task 0e448fcc-3ce9-313b-197e-00000000001f skipping: [managed_node2] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 25201 1726882691.88071: no more pending results, returning what we have 25201 1726882691.88074: results queue empty 25201 1726882691.88075: checking for any_errors_fatal 25201 1726882691.88082: done checking for any_errors_fatal 25201 1726882691.88083: checking for max_fail_percentage 25201 1726882691.88084: done checking for max_fail_percentage 25201 1726882691.88085: checking to see if all hosts have failed and the running result is not ok 25201 1726882691.88086: done checking to see if all hosts have failed 25201 1726882691.88087: getting the remaining hosts for this loop 25201 1726882691.88088: done getting the remaining hosts for this loop 25201 1726882691.88092: getting the next task for host managed_node2 25201 1726882691.88099: done getting next task for host managed_node2 25201 1726882691.88103: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 25201 1726882691.88106: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25201 1726882691.88120: getting variables 25201 1726882691.88121: in VariableManager get_vars() 25201 1726882691.88161: Calling all_inventory to load vars for managed_node2 25201 1726882691.88165: Calling groups_inventory to load vars for managed_node2 25201 1726882691.88168: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882691.88178: Calling all_plugins_play to load vars for managed_node2 25201 1726882691.88180: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882691.88183: Calling groups_plugins_play to load vars for managed_node2 25201 1726882691.89185: done sending task result for task 0e448fcc-3ce9-313b-197e-00000000001f 25201 1726882691.89189: WORKER PROCESS EXITING 25201 1726882691.98671: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882692.01848: done with get_vars() 25201 1726882692.01882: done getting variables 25201 1726882692.01932: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 21:38:12 -0400 (0:00:00.296) 0:00:13.194 ****** 25201 1726882692.01961: entering _queue_task() for managed_node2/package 25201 1726882692.02979: worker is 1 (out of 1 available) 25201 1726882692.02994: exiting _queue_task() for managed_node2/package 25201 1726882692.03006: done queuing things up, now waiting for results queue to drain 25201 1726882692.03008: waiting for pending results... 
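For context on the task queued above (tasks/main.yml:85): the trace below loads the 'package' action and skips because network_state != {} evaluates to False. A hedged sketch of what such a task might look like; only the package action and the conditional are confirmed by this log, the package list is an assumption:

# Hypothetical sketch; the package names are assumed, not shown in the log.
- name: Install NetworkManager and nmstate when using network_state variable
  package:
    name:
      - NetworkManager  # assumed
      - nmstate         # assumed
    state: present
  when: network_state != {}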
25201 1726882692.03448: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 25201 1726882692.03715: in run() - task 0e448fcc-3ce9-313b-197e-000000000020 25201 1726882692.03822: variable 'ansible_search_path' from source: unknown 25201 1726882692.03830: variable 'ansible_search_path' from source: unknown 25201 1726882692.03873: calling self._execute() 25201 1726882692.04078: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882692.04091: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882692.04106: variable 'omit' from source: magic vars 25201 1726882692.04810: variable 'ansible_distribution_major_version' from source: facts 25201 1726882692.04915: Evaluated conditional (ansible_distribution_major_version != '6'): True 25201 1726882692.05153: variable 'network_state' from source: role '' defaults 25201 1726882692.05173: Evaluated conditional (network_state != {}): False 25201 1726882692.05182: when evaluation is False, skipping this task 25201 1726882692.05189: _execute() done 25201 1726882692.05196: dumping result to json 25201 1726882692.05203: done dumping result, returning 25201 1726882692.05213: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0e448fcc-3ce9-313b-197e-000000000020] 25201 1726882692.05344: sending task result for task 0e448fcc-3ce9-313b-197e-000000000020 skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 25201 1726882692.05507: no more pending results, returning what we have 25201 1726882692.05511: results queue empty 25201 1726882692.05512: checking for any_errors_fatal 25201 1726882692.05519: done checking for any_errors_fatal 25201 1726882692.05520: checking for max_fail_percentage 25201 1726882692.05522: done checking for max_fail_percentage 25201 1726882692.05523: checking to see if all hosts have failed and the running result is not ok 25201 1726882692.05523: done checking to see if all hosts have failed 25201 1726882692.05524: getting the remaining hosts for this loop 25201 1726882692.05526: done getting the remaining hosts for this loop 25201 1726882692.05530: getting the next task for host managed_node2 25201 1726882692.05537: done getting next task for host managed_node2 25201 1726882692.05540: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 25201 1726882692.05543: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25201 1726882692.05565: getting variables 25201 1726882692.05567: in VariableManager get_vars() 25201 1726882692.05608: Calling all_inventory to load vars for managed_node2 25201 1726882692.05611: Calling groups_inventory to load vars for managed_node2 25201 1726882692.05614: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882692.05626: Calling all_plugins_play to load vars for managed_node2 25201 1726882692.05629: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882692.05632: Calling groups_plugins_play to load vars for managed_node2 25201 1726882692.06202: done sending task result for task 0e448fcc-3ce9-313b-197e-000000000020 25201 1726882692.06205: WORKER PROCESS EXITING 25201 1726882692.08159: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882692.10286: done with get_vars() 25201 1726882692.10307: done getting variables 25201 1726882692.10369: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 21:38:12 -0400 (0:00:00.084) 0:00:13.278 ****** 25201 1726882692.10401: entering _queue_task() for managed_node2/package 25201 1726882692.10705: worker is 1 (out of 1 available) 25201 1726882692.10718: exiting _queue_task() for managed_node2/package 25201 1726882692.10730: done queuing things up, now waiting for results queue to drain 25201 1726882692.10732: waiting for pending results... 
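Both network_state-gated tasks in this section skip because network_state resolves from "role '' defaults" to an empty mapping. The role default implied by that trace is presumably just an empty dict, roughly:

# Likely shape of the role default (e.g. defaults/main.yml); inferred from
# "variable 'network_state' from source: role '' defaults" together with the
# conditional (network_state != {}) evaluating to False.
network_state: {}

The python3-libnmstate task queued above (tasks/main.yml:96) would follow the same pattern as the earlier package sketch, with python3-libnmstate as the package and the same network_state != {} guard.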
25201 1726882692.11013: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 25201 1726882692.11159: in run() - task 0e448fcc-3ce9-313b-197e-000000000021 25201 1726882692.11182: variable 'ansible_search_path' from source: unknown 25201 1726882692.11190: variable 'ansible_search_path' from source: unknown 25201 1726882692.11226: calling self._execute() 25201 1726882692.11325: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882692.11336: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882692.11355: variable 'omit' from source: magic vars 25201 1726882692.11725: variable 'ansible_distribution_major_version' from source: facts 25201 1726882692.11743: Evaluated conditional (ansible_distribution_major_version != '6'): True 25201 1726882692.11873: variable 'network_state' from source: role '' defaults 25201 1726882692.11894: Evaluated conditional (network_state != {}): False 25201 1726882692.11902: when evaluation is False, skipping this task 25201 1726882692.11909: _execute() done 25201 1726882692.11915: dumping result to json 25201 1726882692.11953: done dumping result, returning 25201 1726882692.11967: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0e448fcc-3ce9-313b-197e-000000000021] 25201 1726882692.11979: sending task result for task 0e448fcc-3ce9-313b-197e-000000000021 skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 25201 1726882692.12208: no more pending results, returning what we have 25201 1726882692.12212: results queue empty 25201 1726882692.12213: checking for any_errors_fatal 25201 1726882692.12220: done checking for any_errors_fatal 25201 1726882692.12221: checking for max_fail_percentage 25201 1726882692.12223: done checking for max_fail_percentage 25201 1726882692.12224: checking to see if all hosts have failed and the running result is not ok 25201 1726882692.12225: done checking to see if all hosts have failed 25201 1726882692.12226: getting the remaining hosts for this loop 25201 1726882692.12227: done getting the remaining hosts for this loop 25201 1726882692.12231: getting the next task for host managed_node2 25201 1726882692.12239: done getting next task for host managed_node2 25201 1726882692.12243: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 25201 1726882692.12246: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25201 1726882692.12261: getting variables 25201 1726882692.12265: in VariableManager get_vars() 25201 1726882692.12305: Calling all_inventory to load vars for managed_node2 25201 1726882692.12308: Calling groups_inventory to load vars for managed_node2 25201 1726882692.12311: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882692.12323: Calling all_plugins_play to load vars for managed_node2 25201 1726882692.12326: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882692.12330: Calling groups_plugins_play to load vars for managed_node2 25201 1726882692.13473: done sending task result for task 0e448fcc-3ce9-313b-197e-000000000021 25201 1726882692.13477: WORKER PROCESS EXITING 25201 1726882692.14429: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882692.15491: done with get_vars() 25201 1726882692.15508: done getting variables 25201 1726882692.15580: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 21:38:12 -0400 (0:00:00.052) 0:00:13.330 ****** 25201 1726882692.15602: entering _queue_task() for managed_node2/service 25201 1726882692.15604: Creating lock for service 25201 1726882692.15801: worker is 1 (out of 1 available) 25201 1726882692.15814: exiting _queue_task() for managed_node2/service 25201 1726882692.15827: done queuing things up, now waiting for results queue to drain 25201 1726882692.15828: waiting for pending results... 
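For context on the task queued above (tasks/main.yml:109): the trace below loads the 'service' action plugin and skips on the same wireless/team conditional as the earlier consent task. A hedged sketch; only the service action and the conditional are confirmed by the log, the service name and state are assumptions:

# Hypothetical sketch; the service name and state are assumptions.
- name: Restart NetworkManager due to wireless or team interfaces
  service:
    name: NetworkManager   # assumed
    state: restarted       # assumed
  when: __network_wireless_connections_defined or __network_team_connections_defined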
25201 1726882692.16012: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 25201 1726882692.16161: in run() - task 0e448fcc-3ce9-313b-197e-000000000022 25201 1726882692.16184: variable 'ansible_search_path' from source: unknown 25201 1726882692.16193: variable 'ansible_search_path' from source: unknown 25201 1726882692.16232: calling self._execute() 25201 1726882692.16325: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882692.16337: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882692.16352: variable 'omit' from source: magic vars 25201 1726882692.16978: variable 'ansible_distribution_major_version' from source: facts 25201 1726882692.16996: Evaluated conditional (ansible_distribution_major_version != '6'): True 25201 1726882692.17131: variable '__network_wireless_connections_defined' from source: role '' defaults 25201 1726882692.17342: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 25201 1726882692.19516: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 25201 1726882692.19800: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 25201 1726882692.19827: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 25201 1726882692.19850: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 25201 1726882692.19874: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 25201 1726882692.19933: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25201 1726882692.19952: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25201 1726882692.19975: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25201 1726882692.20005: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25201 1726882692.20016: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25201 1726882692.20046: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25201 1726882692.20063: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25201 1726882692.20085: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 25201 1726882692.20113: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25201 1726882692.20123: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25201 1726882692.20150: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25201 1726882692.20169: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25201 1726882692.20187: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25201 1726882692.20219: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25201 1726882692.20229: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25201 1726882692.20342: variable 'network_connections' from source: task vars 25201 1726882692.20351: variable 'interface' from source: play vars 25201 1726882692.20406: variable 'interface' from source: play vars 25201 1726882692.20454: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 25201 1726882692.20566: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 25201 1726882692.20612: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 25201 1726882692.20640: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 25201 1726882692.20662: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 25201 1726882692.20744: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 25201 1726882692.20760: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 25201 1726882692.20804: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 25201 1726882692.20833: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 25201 1726882692.20930: variable '__network_team_connections_defined' from source: role '' defaults 25201 1726882692.21200: variable 'network_connections' from source: task vars 25201 1726882692.21211: variable 'interface' from source: 
play vars 25201 1726882692.21300: variable 'interface' from source: play vars 25201 1726882692.21344: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 25201 1726882692.21353: when evaluation is False, skipping this task 25201 1726882692.21359: _execute() done 25201 1726882692.21367: dumping result to json 25201 1726882692.21375: done dumping result, returning 25201 1726882692.22183: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0e448fcc-3ce9-313b-197e-000000000022] 25201 1726882692.22193: sending task result for task 0e448fcc-3ce9-313b-197e-000000000022 25201 1726882692.22761: done sending task result for task 0e448fcc-3ce9-313b-197e-000000000022 25201 1726882692.22774: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 25201 1726882692.22816: no more pending results, returning what we have 25201 1726882692.22821: results queue empty 25201 1726882692.22822: checking for any_errors_fatal 25201 1726882692.22828: done checking for any_errors_fatal 25201 1726882692.22829: checking for max_fail_percentage 25201 1726882692.22830: done checking for max_fail_percentage 25201 1726882692.22831: checking to see if all hosts have failed and the running result is not ok 25201 1726882692.22832: done checking to see if all hosts have failed 25201 1726882692.22833: getting the remaining hosts for this loop 25201 1726882692.22834: done getting the remaining hosts for this loop 25201 1726882692.22838: getting the next task for host managed_node2 25201 1726882692.22844: done getting next task for host managed_node2 25201 1726882692.22848: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 25201 1726882692.22851: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25201 1726882692.22862: getting variables 25201 1726882692.22866: in VariableManager get_vars() 25201 1726882692.23126: Calling all_inventory to load vars for managed_node2 25201 1726882692.23129: Calling groups_inventory to load vars for managed_node2 25201 1726882692.23136: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882692.23149: Calling all_plugins_play to load vars for managed_node2 25201 1726882692.23154: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882692.23158: Calling groups_plugins_play to load vars for managed_node2 25201 1726882692.24557: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882692.26571: done with get_vars() 25201 1726882692.26588: done getting variables 25201 1726882692.26630: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 21:38:12 -0400 (0:00:00.110) 0:00:13.441 ****** 25201 1726882692.26667: entering _queue_task() for managed_node2/service 25201 1726882692.26928: worker is 1 (out of 1 available) 25201 1726882692.26942: exiting _queue_task() for managed_node2/service 25201 1726882692.26954: done queuing things up, now waiting for results queue to drain 25201 1726882692.26956: waiting for pending results... 25201 1726882692.27369: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 25201 1726882692.27518: in run() - task 0e448fcc-3ce9-313b-197e-000000000023 25201 1726882692.27536: variable 'ansible_search_path' from source: unknown 25201 1726882692.27551: variable 'ansible_search_path' from source: unknown 25201 1726882692.27594: calling self._execute() 25201 1726882692.27694: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882692.27705: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882692.27720: variable 'omit' from source: magic vars 25201 1726882692.28139: variable 'ansible_distribution_major_version' from source: facts 25201 1726882692.28159: Evaluated conditional (ansible_distribution_major_version != '6'): True 25201 1726882692.28344: variable 'network_provider' from source: set_fact 25201 1726882692.28354: variable 'network_state' from source: role '' defaults 25201 1726882692.28370: Evaluated conditional (network_provider == "nm" or network_state != {}): True 25201 1726882692.28380: variable 'omit' from source: magic vars 25201 1726882692.28449: variable 'omit' from source: magic vars 25201 1726882692.28497: variable 'network_service_name' from source: role '' defaults 25201 1726882692.28592: variable 'network_service_name' from source: role '' defaults 25201 1726882692.28715: variable '__network_provider_setup' from source: role '' defaults 25201 1726882692.28742: variable '__network_service_name_default_nm' from source: role '' defaults 25201 1726882692.28870: variable '__network_service_name_default_nm' from source: role '' defaults 25201 1726882692.28896: variable '__network_packages_default_nm' from source: role '' defaults 
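For context on the task queued above (tasks/main.yml:122): unlike the previous tasks, this one actually runs. The trace below shows the conditional network_provider == "nm" or network_state != {} evaluating to True, the network_service_name role default being resolved, and the SSH connection plugin being set up to execute the service module remotely. A hedged sketch of what the task might look like; the variable name and the conditional appear in the trace, while the state and enabled values are assumptions:

# Hypothetical sketch; the state/enabled values are assumptions.
- name: Enable and start NetworkManager
  service:
    name: "{{ network_service_name }}"
    state: started   # assumed
    enabled: true    # assumed
  when: network_provider == "nm" or network_state != {}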
25201 1726882692.28989: variable '__network_packages_default_nm' from source: role '' defaults 25201 1726882692.29208: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 25201 1726882692.30929: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 25201 1726882692.31010: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 25201 1726882692.31055: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 25201 1726882692.31098: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 25201 1726882692.31133: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 25201 1726882692.31215: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25201 1726882692.31253: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25201 1726882692.31285: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25201 1726882692.31334: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25201 1726882692.31357: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25201 1726882692.31422: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25201 1726882692.31453: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25201 1726882692.31475: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25201 1726882692.31504: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25201 1726882692.31541: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25201 1726882692.31694: variable '__network_packages_default_gobject_packages' from source: role '' defaults 25201 1726882692.31773: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25201 1726882692.31792: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25201 1726882692.31810: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25201 1726882692.31834: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25201 1726882692.31844: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25201 1726882692.31908: variable 'ansible_python' from source: facts 25201 1726882692.31925: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 25201 1726882692.31981: variable '__network_wpa_supplicant_required' from source: role '' defaults 25201 1726882692.32037: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 25201 1726882692.32118: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25201 1726882692.32136: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25201 1726882692.32153: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25201 1726882692.32180: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25201 1726882692.32190: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25201 1726882692.32223: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25201 1726882692.32243: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25201 1726882692.32260: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25201 1726882692.32288: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25201 1726882692.32298: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25201 1726882692.32390: variable 'network_connections' from 
source: task vars 25201 1726882692.32396: variable 'interface' from source: play vars 25201 1726882692.32449: variable 'interface' from source: play vars 25201 1726882692.32523: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 25201 1726882692.32643: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 25201 1726882692.32681: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 25201 1726882692.32711: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 25201 1726882692.32739: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 25201 1726882692.32786: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 25201 1726882692.32807: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 25201 1726882692.32829: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 25201 1726882692.32852: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 25201 1726882692.32889: variable '__network_wireless_connections_defined' from source: role '' defaults 25201 1726882692.33060: variable 'network_connections' from source: task vars 25201 1726882692.33066: variable 'interface' from source: play vars 25201 1726882692.33121: variable 'interface' from source: play vars 25201 1726882692.33154: variable '__network_packages_default_wireless' from source: role '' defaults 25201 1726882692.33211: variable '__network_wireless_connections_defined' from source: role '' defaults 25201 1726882692.33392: variable 'network_connections' from source: task vars 25201 1726882692.33397: variable 'interface' from source: play vars 25201 1726882692.33446: variable 'interface' from source: play vars 25201 1726882692.33472: variable '__network_packages_default_team' from source: role '' defaults 25201 1726882692.33523: variable '__network_team_connections_defined' from source: role '' defaults 25201 1726882692.33707: variable 'network_connections' from source: task vars 25201 1726882692.33710: variable 'interface' from source: play vars 25201 1726882692.33760: variable 'interface' from source: play vars 25201 1726882692.33805: variable '__network_service_name_default_initscripts' from source: role '' defaults 25201 1726882692.33847: variable '__network_service_name_default_initscripts' from source: role '' defaults 25201 1726882692.33856: variable '__network_packages_default_initscripts' from source: role '' defaults 25201 1726882692.33899: variable '__network_packages_default_initscripts' from source: role '' defaults 25201 1726882692.34035: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 25201 1726882692.34346: variable 'network_connections' from source: task vars 25201 1726882692.34349: variable 'interface' from source: play vars 25201 1726882692.34399: variable 'interface' from source: play vars 25201 
1726882692.34404: variable 'ansible_distribution' from source: facts 25201 1726882692.34408: variable '__network_rh_distros' from source: role '' defaults 25201 1726882692.34413: variable 'ansible_distribution_major_version' from source: facts 25201 1726882692.34431: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 25201 1726882692.34543: variable 'ansible_distribution' from source: facts 25201 1726882692.34546: variable '__network_rh_distros' from source: role '' defaults 25201 1726882692.34551: variable 'ansible_distribution_major_version' from source: facts 25201 1726882692.34561: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 25201 1726882692.34676: variable 'ansible_distribution' from source: facts 25201 1726882692.34679: variable '__network_rh_distros' from source: role '' defaults 25201 1726882692.34685: variable 'ansible_distribution_major_version' from source: facts 25201 1726882692.34710: variable 'network_provider' from source: set_fact 25201 1726882692.34727: variable 'omit' from source: magic vars 25201 1726882692.34746: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25201 1726882692.34769: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25201 1726882692.34781: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25201 1726882692.34794: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25201 1726882692.34802: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25201 1726882692.34824: variable 'inventory_hostname' from source: host vars for 'managed_node2' 25201 1726882692.34831: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882692.34833: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882692.34899: Set connection var ansible_shell_executable to /bin/sh 25201 1726882692.34903: Set connection var ansible_pipelining to False 25201 1726882692.34908: Set connection var ansible_connection to ssh 25201 1726882692.34913: Set connection var ansible_module_compression to ZIP_DEFLATED 25201 1726882692.34915: Set connection var ansible_shell_type to sh 25201 1726882692.34922: Set connection var ansible_timeout to 10 25201 1726882692.34940: variable 'ansible_shell_executable' from source: unknown 25201 1726882692.34942: variable 'ansible_connection' from source: unknown 25201 1726882692.34945: variable 'ansible_module_compression' from source: unknown 25201 1726882692.34947: variable 'ansible_shell_type' from source: unknown 25201 1726882692.34949: variable 'ansible_shell_executable' from source: unknown 25201 1726882692.34951: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882692.34953: variable 'ansible_pipelining' from source: unknown 25201 1726882692.34961: variable 'ansible_timeout' from source: unknown 25201 1726882692.34966: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882692.35027: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 
(found_in_cache=True, class_only=False) 25201 1726882692.35034: variable 'omit' from source: magic vars 25201 1726882692.35040: starting attempt loop 25201 1726882692.35046: running the handler 25201 1726882692.35098: variable 'ansible_facts' from source: unknown 25201 1726882692.35561: _low_level_execute_command(): starting 25201 1726882692.35573: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 25201 1726882692.36058: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882692.36069: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882692.36097: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882692.36110: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882692.36159: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882692.36179: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882692.36288: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882692.37952: stdout chunk (state=3): >>>/root <<< 25201 1726882692.38061: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882692.38111: stderr chunk (state=3): >>><<< 25201 1726882692.38114: stdout chunk (state=3): >>><<< 25201 1726882692.38131: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25201 1726882692.38140: _low_level_execute_command(): starting 25201 1726882692.38146: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir 
-p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882692.3813097-25799-46531204678205 `" && echo ansible-tmp-1726882692.3813097-25799-46531204678205="` echo /root/.ansible/tmp/ansible-tmp-1726882692.3813097-25799-46531204678205 `" ) && sleep 0' 25201 1726882692.38583: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882692.38603: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882692.38615: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882692.38630: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<< 25201 1726882692.38640: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882692.38686: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882692.38694: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882692.38819: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882692.40676: stdout chunk (state=3): >>>ansible-tmp-1726882692.3813097-25799-46531204678205=/root/.ansible/tmp/ansible-tmp-1726882692.3813097-25799-46531204678205 <<< 25201 1726882692.40784: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882692.40828: stderr chunk (state=3): >>><<< 25201 1726882692.40831: stdout chunk (state=3): >>><<< 25201 1726882692.40843: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882692.3813097-25799-46531204678205=/root/.ansible/tmp/ansible-tmp-1726882692.3813097-25799-46531204678205 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25201 
1726882692.40873: variable 'ansible_module_compression' from source: unknown 25201 1726882692.40916: ANSIBALLZ: Using generic lock for ansible.legacy.systemd 25201 1726882692.40920: ANSIBALLZ: Acquiring lock 25201 1726882692.40923: ANSIBALLZ: Lock acquired: 140300039193808 25201 1726882692.40925: ANSIBALLZ: Creating module 25201 1726882692.59278: ANSIBALLZ: Writing module into payload 25201 1726882692.59407: ANSIBALLZ: Writing module 25201 1726882692.59433: ANSIBALLZ: Renaming module 25201 1726882692.59439: ANSIBALLZ: Done creating module 25201 1726882692.59456: variable 'ansible_facts' from source: unknown 25201 1726882692.59562: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882692.3813097-25799-46531204678205/AnsiballZ_systemd.py 25201 1726882692.59674: Sending initial data 25201 1726882692.59678: Sent initial data (155 bytes) 25201 1726882692.60347: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882692.60354: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882692.60394: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 25201 1726882692.60400: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration <<< 25201 1726882692.60405: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882692.60415: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882692.60425: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882692.60472: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882692.60484: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882692.60495: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882692.60620: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882692.62450: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 25201 1726882692.62549: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 25201 1726882692.62645: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-25201fmfeipqk/tmpuo_ejd0d 
/root/.ansible/tmp/ansible-tmp-1726882692.3813097-25799-46531204678205/AnsiballZ_systemd.py <<< 25201 1726882692.62769: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 25201 1726882692.65692: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882692.65776: stderr chunk (state=3): >>><<< 25201 1726882692.65788: stdout chunk (state=3): >>><<< 25201 1726882692.65819: done transferring module to remote 25201 1726882692.65833: _low_level_execute_command(): starting 25201 1726882692.65842: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882692.3813097-25799-46531204678205/ /root/.ansible/tmp/ansible-tmp-1726882692.3813097-25799-46531204678205/AnsiballZ_systemd.py && sleep 0' 25201 1726882692.66513: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25201 1726882692.66525: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882692.66539: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882692.66555: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882692.66602: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882692.66613: stderr chunk (state=3): >>>debug2: match not found <<< 25201 1726882692.66626: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882692.66642: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25201 1726882692.66652: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 25201 1726882692.66662: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 25201 1726882692.66682: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882692.66695: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882692.66710: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882692.66721: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882692.66731: stderr chunk (state=3): >>>debug2: match found <<< 25201 1726882692.66742: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882692.66824: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882692.66844: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882692.66858: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882692.66988: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882692.68831: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882692.68834: stdout chunk (state=3): >>><<< 25201 1726882692.68837: stderr chunk (state=3): >>><<< 25201 1726882692.68870: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25201 1726882692.68873: _low_level_execute_command(): starting 25201 1726882692.68876: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882692.3813097-25799-46531204678205/AnsiballZ_systemd.py && sleep 0' 25201 1726882692.69501: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25201 1726882692.69518: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882692.69535: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882692.69554: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882692.69600: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882692.69613: stderr chunk (state=3): >>>debug2: match not found <<< 25201 1726882692.69630: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882692.69651: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25201 1726882692.69665: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 25201 1726882692.69682: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 25201 1726882692.69695: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882692.69709: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882692.69725: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882692.69736: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882692.69751: stderr chunk (state=3): >>>debug2: match found <<< 25201 1726882692.69766: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882692.69842: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882692.69861: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882692.69880: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882692.70145: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882692.94942: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", 
"TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6692", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ExecMainStartTimestampMonotonic": "202392137", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "6692", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3602", "MemoryCurrent": "9170944", "MemoryAvailable": "infinity", "CPUUsageNSec": "1757735000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": 
"65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft"<<< 25201 1726882692.94960: stdout chunk (state=3): >>>: "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service network.service multi-user.target network.target shutdown.target cloud-init.service", "After": "cloud-init-local.service dbus-broker.service network-pre.target system.slice dbus.socket systemd-journald.socket basic.target sysinit.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:32:57 EDT", "StateChangeTimestampMonotonic": "316658837", 
"InactiveExitTimestamp": "Fri 2024-09-20 21:31:03 EDT", "InactiveExitTimestampMonotonic": "202392395", "ActiveEnterTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ActiveEnterTimestampMonotonic": "202472383", "ActiveExitTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ActiveExitTimestampMonotonic": "202362940", "InactiveEnterTimestamp": "Fri 2024-09-20 21:31:03 EDT", "InactiveEnterTimestampMonotonic": "202381901", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ConditionTimestampMonotonic": "202382734", "AssertTimestamp": "Fri 2024-09-20 21:31:03 EDT", "AssertTimestampMonotonic": "202382737", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "55e27919215348fab37a11b7ea324f90", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 25201 1726882692.96500: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. <<< 25201 1726882692.96503: stdout chunk (state=3): >>><<< 25201 1726882692.96506: stderr chunk (state=3): >>><<< 25201 1726882692.96769: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6692", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ExecMainStartTimestampMonotonic": "202392137", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "6692", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; 
start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3602", "MemoryCurrent": "9170944", "MemoryAvailable": "infinity", "CPUUsageNSec": "1757735000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", 
"ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service network.service multi-user.target network.target shutdown.target cloud-init.service", "After": "cloud-init-local.service dbus-broker.service network-pre.target system.slice dbus.socket systemd-journald.socket basic.target sysinit.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:32:57 EDT", "StateChangeTimestampMonotonic": "316658837", "InactiveExitTimestamp": "Fri 2024-09-20 21:31:03 EDT", "InactiveExitTimestampMonotonic": "202392395", "ActiveEnterTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ActiveEnterTimestampMonotonic": "202472383", "ActiveExitTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ActiveExitTimestampMonotonic": "202362940", "InactiveEnterTimestamp": "Fri 2024-09-20 21:31:03 EDT", "InactiveEnterTimestampMonotonic": "202381901", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ConditionTimestampMonotonic": "202382734", "AssertTimestamp": "Fri 2024-09-20 21:31:03 EDT", "AssertTimestampMonotonic": "202382737", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "55e27919215348fab37a11b7ea324f90", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": 
"system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 25201 1726882692.96778: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882692.3813097-25799-46531204678205/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 25201 1726882692.96781: _low_level_execute_command(): starting 25201 1726882692.96783: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882692.3813097-25799-46531204678205/ > /dev/null 2>&1 && sleep 0' 25201 1726882692.97378: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25201 1726882692.97519: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882692.97638: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882692.97658: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882692.97703: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882692.97716: stderr chunk (state=3): >>>debug2: match not found <<< 25201 1726882692.97738: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882692.97756: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25201 1726882692.97771: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 25201 1726882692.97785: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 25201 1726882692.97798: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 
1726882692.97812: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882692.97828: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882692.97846: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882692.97859: stderr chunk (state=3): >>>debug2: match found <<< 25201 1726882692.97879: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882692.97978: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882692.98000: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882692.98018: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882692.98145: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882693.00007: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882693.00010: stdout chunk (state=3): >>><<< 25201 1726882693.00012: stderr chunk (state=3): >>><<< 25201 1726882693.00169: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25201 1726882693.00172: handler run complete 25201 1726882693.00175: attempt loop complete, returning result 25201 1726882693.00177: _execute() done 25201 1726882693.00179: dumping result to json 25201 1726882693.00181: done dumping result, returning 25201 1726882693.00183: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0e448fcc-3ce9-313b-197e-000000000023] 25201 1726882693.00185: sending task result for task 0e448fcc-3ce9-313b-197e-000000000023 ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 25201 1726882693.00476: no more pending results, returning what we have 25201 1726882693.00479: results queue empty 25201 1726882693.00480: checking for any_errors_fatal 25201 1726882693.00485: done checking for any_errors_fatal 25201 1726882693.00486: checking for max_fail_percentage 25201 1726882693.00488: done checking for max_fail_percentage 25201 1726882693.00489: checking to see if all hosts have failed and the running result is not ok 25201 1726882693.00490: 
done checking to see if all hosts have failed 25201 1726882693.00490: getting the remaining hosts for this loop 25201 1726882693.00492: done getting the remaining hosts for this loop 25201 1726882693.00495: getting the next task for host managed_node2 25201 1726882693.00502: done getting next task for host managed_node2 25201 1726882693.00506: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 25201 1726882693.00509: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25201 1726882693.00519: getting variables 25201 1726882693.00521: in VariableManager get_vars() 25201 1726882693.00558: Calling all_inventory to load vars for managed_node2 25201 1726882693.00561: Calling groups_inventory to load vars for managed_node2 25201 1726882693.00565: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882693.00575: Calling all_plugins_play to load vars for managed_node2 25201 1726882693.00577: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882693.00580: Calling groups_plugins_play to load vars for managed_node2 25201 1726882693.01469: done sending task result for task 0e448fcc-3ce9-313b-197e-000000000023 25201 1726882693.01475: WORKER PROCESS EXITING 25201 1726882693.02523: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882693.04403: done with get_vars() 25201 1726882693.04423: done getting variables 25201 1726882693.04483: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 21:38:13 -0400 (0:00:00.778) 0:00:14.219 ****** 25201 1726882693.04525: entering _queue_task() for managed_node2/service 25201 1726882693.04805: worker is 1 (out of 1 available) 25201 1726882693.04817: exiting _queue_task() for managed_node2/service 25201 1726882693.04832: done queuing things up, now waiting for results queue to drain 25201 1726882693.04834: waiting for pending results... 
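For orientation, the module_args captured above for the "Enable and start NetworkManager" result (ansible.legacy.systemd invoked with name=NetworkManager, state=started, enabled=true, and the result censored because no_log was in effect) correspond roughly to a task of the following shape. This is a minimal sketch reconstructed from the logged arguments, not the actual contents of roles/network/tasks/main.yml, and the task name and option spelling in the role may differ:

    # Sketch inferred from the logged module_args; assumes the systemd module
    # (resolved in this run as ansible.legacy.systemd) and no_log, as seen above.
    - name: Enable and start NetworkManager
      ansible.builtin.systemd:
        name: NetworkManager
        state: started
        enabled: true
      no_log: true
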
25201 1726882693.05102: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 25201 1726882693.05241: in run() - task 0e448fcc-3ce9-313b-197e-000000000024 25201 1726882693.05259: variable 'ansible_search_path' from source: unknown 25201 1726882693.05274: variable 'ansible_search_path' from source: unknown 25201 1726882693.05313: calling self._execute() 25201 1726882693.05406: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882693.05417: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882693.05430: variable 'omit' from source: magic vars 25201 1726882693.05798: variable 'ansible_distribution_major_version' from source: facts 25201 1726882693.05822: Evaluated conditional (ansible_distribution_major_version != '6'): True 25201 1726882693.05951: variable 'network_provider' from source: set_fact 25201 1726882693.05961: Evaluated conditional (network_provider == "nm"): True 25201 1726882693.06068: variable '__network_wpa_supplicant_required' from source: role '' defaults 25201 1726882693.06167: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 25201 1726882693.06342: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 25201 1726882693.08622: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 25201 1726882693.08701: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 25201 1726882693.08742: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 25201 1726882693.08788: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 25201 1726882693.08819: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 25201 1726882693.08918: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25201 1726882693.08953: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25201 1726882693.08996: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25201 1726882693.09043: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25201 1726882693.09060: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25201 1726882693.09115: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25201 1726882693.09140: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 25201 1726882693.09167: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25201 1726882693.09216: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25201 1726882693.09232: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25201 1726882693.09274: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25201 1726882693.09300: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25201 1726882693.09332: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25201 1726882693.09374: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25201 1726882693.09389: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25201 1726882693.09542: variable 'network_connections' from source: task vars 25201 1726882693.09558: variable 'interface' from source: play vars 25201 1726882693.09643: variable 'interface' from source: play vars 25201 1726882693.09722: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 25201 1726882693.09895: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 25201 1726882693.09934: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 25201 1726882693.09979: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 25201 1726882693.10012: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 25201 1726882693.10060: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 25201 1726882693.10095: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 25201 1726882693.10124: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 25201 1726882693.10154: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 25201 
1726882693.10215: variable '__network_wireless_connections_defined' from source: role '' defaults 25201 1726882693.10483: variable 'network_connections' from source: task vars 25201 1726882693.10494: variable 'interface' from source: play vars 25201 1726882693.10567: variable 'interface' from source: play vars 25201 1726882693.10618: Evaluated conditional (__network_wpa_supplicant_required): False 25201 1726882693.10626: when evaluation is False, skipping this task 25201 1726882693.10634: _execute() done 25201 1726882693.10640: dumping result to json 25201 1726882693.10648: done dumping result, returning 25201 1726882693.10660: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0e448fcc-3ce9-313b-197e-000000000024] 25201 1726882693.10682: sending task result for task 0e448fcc-3ce9-313b-197e-000000000024 skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 25201 1726882693.10832: no more pending results, returning what we have 25201 1726882693.10835: results queue empty 25201 1726882693.10837: checking for any_errors_fatal 25201 1726882693.10862: done checking for any_errors_fatal 25201 1726882693.10863: checking for max_fail_percentage 25201 1726882693.10868: done checking for max_fail_percentage 25201 1726882693.10869: checking to see if all hosts have failed and the running result is not ok 25201 1726882693.10870: done checking to see if all hosts have failed 25201 1726882693.10870: getting the remaining hosts for this loop 25201 1726882693.10872: done getting the remaining hosts for this loop 25201 1726882693.10876: getting the next task for host managed_node2 25201 1726882693.10883: done getting next task for host managed_node2 25201 1726882693.10887: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 25201 1726882693.10892: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25201 1726882693.10907: getting variables 25201 1726882693.10910: in VariableManager get_vars() 25201 1726882693.10953: Calling all_inventory to load vars for managed_node2 25201 1726882693.10957: Calling groups_inventory to load vars for managed_node2 25201 1726882693.10960: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882693.10972: Calling all_plugins_play to load vars for managed_node2 25201 1726882693.10976: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882693.10979: Calling groups_plugins_play to load vars for managed_node2 25201 1726882693.11984: done sending task result for task 0e448fcc-3ce9-313b-197e-000000000024 25201 1726882693.11988: WORKER PROCESS EXITING 25201 1726882693.12741: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882693.14912: done with get_vars() 25201 1726882693.14941: done getting variables 25201 1726882693.15001: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 21:38:13 -0400 (0:00:00.105) 0:00:14.324 ****** 25201 1726882693.15033: entering _queue_task() for managed_node2/service 25201 1726882693.15356: worker is 1 (out of 1 available) 25201 1726882693.15390: exiting _queue_task() for managed_node2/service 25201 1726882693.15409: done queuing things up, now waiting for results queue to drain 25201 1726882693.15411: waiting for pending results... 
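The skip recorded just above is the role's normal conditional flow: the task executor evaluates the task's when expression (here __network_wpa_supplicant_required) inside _execute() before doing any remote work, and when it is False it dumps a skip result without ever contacting the host. As an illustrative sketch only, not the role's actual tasks file, a service task guarded in that way would look roughly like this (only the variable name and the skip behaviour are taken from the log; everything else is assumed):

    # Hedged sketch: playbook shape and module arguments are assumptions for
    # illustration; the when expression mirrors the false_condition logged above.
    - hosts: managed_node2
      vars:
        __network_wpa_supplicant_required: false
      tasks:
        - name: Enable and start wpa_supplicant
          ansible.builtin.service:
            name: wpa_supplicant
            state: started
            enabled: true
          when: __network_wpa_supplicant_required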
25201 1726882693.15694: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service 25201 1726882693.15834: in run() - task 0e448fcc-3ce9-313b-197e-000000000025 25201 1726882693.15860: variable 'ansible_search_path' from source: unknown 25201 1726882693.15871: variable 'ansible_search_path' from source: unknown 25201 1726882693.15912: calling self._execute() 25201 1726882693.16011: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882693.16023: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882693.16044: variable 'omit' from source: magic vars 25201 1726882693.16426: variable 'ansible_distribution_major_version' from source: facts 25201 1726882693.16445: Evaluated conditional (ansible_distribution_major_version != '6'): True 25201 1726882693.16577: variable 'network_provider' from source: set_fact 25201 1726882693.16590: Evaluated conditional (network_provider == "initscripts"): False 25201 1726882693.16598: when evaluation is False, skipping this task 25201 1726882693.16604: _execute() done 25201 1726882693.16614: dumping result to json 25201 1726882693.16621: done dumping result, returning 25201 1726882693.16632: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service [0e448fcc-3ce9-313b-197e-000000000025] 25201 1726882693.16642: sending task result for task 0e448fcc-3ce9-313b-197e-000000000025 skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 25201 1726882693.16797: no more pending results, returning what we have 25201 1726882693.16802: results queue empty 25201 1726882693.16803: checking for any_errors_fatal 25201 1726882693.16810: done checking for any_errors_fatal 25201 1726882693.16811: checking for max_fail_percentage 25201 1726882693.16813: done checking for max_fail_percentage 25201 1726882693.16815: checking to see if all hosts have failed and the running result is not ok 25201 1726882693.16816: done checking to see if all hosts have failed 25201 1726882693.16816: getting the remaining hosts for this loop 25201 1726882693.16818: done getting the remaining hosts for this loop 25201 1726882693.16822: getting the next task for host managed_node2 25201 1726882693.16831: done getting next task for host managed_node2 25201 1726882693.16835: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 25201 1726882693.16838: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25201 1726882693.16857: getting variables 25201 1726882693.16859: in VariableManager get_vars() 25201 1726882693.16904: Calling all_inventory to load vars for managed_node2 25201 1726882693.16907: Calling groups_inventory to load vars for managed_node2 25201 1726882693.16910: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882693.16923: Calling all_plugins_play to load vars for managed_node2 25201 1726882693.16926: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882693.16929: Calling groups_plugins_play to load vars for managed_node2 25201 1726882693.19274: done sending task result for task 0e448fcc-3ce9-313b-197e-000000000025 25201 1726882693.19278: WORKER PROCESS EXITING 25201 1726882693.20673: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882693.24115: done with get_vars() 25201 1726882693.24140: done getting variables 25201 1726882693.24199: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 21:38:13 -0400 (0:00:00.091) 0:00:14.416 ****** 25201 1726882693.24234: entering _queue_task() for managed_node2/copy 25201 1726882693.24523: worker is 1 (out of 1 available) 25201 1726882693.24535: exiting _queue_task() for managed_node2/copy 25201 1726882693.24545: done queuing things up, now waiting for results queue to drain 25201 1726882693.24546: waiting for pending results... 
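The "censored" result for "Enable network service" above is what a skipped task looks like when it carries no_log: true; Ansible hides even the skip details from the result. A hedged sketch of a task written that way follows (the service name and other arguments are assumptions for illustration, not the role's real source; the when expression and the no_log behaviour are what the log shows):

    # Hedged sketch: service name is assumed; no_log: true is what produces the
    # "censored" result seen in the log above instead of the usual skip_reason.
    - name: Enable network service
      ansible.builtin.service:
        name: network        # assumed service name for illustration
        enabled: true
      when: network_provider == "initscripts"
      no_log: true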
25201 1726882693.25445: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 25201 1726882693.25757: in run() - task 0e448fcc-3ce9-313b-197e-000000000026 25201 1726882693.25777: variable 'ansible_search_path' from source: unknown 25201 1726882693.25842: variable 'ansible_search_path' from source: unknown 25201 1726882693.25882: calling self._execute() 25201 1726882693.26083: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882693.26094: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882693.26107: variable 'omit' from source: magic vars 25201 1726882693.26797: variable 'ansible_distribution_major_version' from source: facts 25201 1726882693.26937: Evaluated conditional (ansible_distribution_major_version != '6'): True 25201 1726882693.27099: variable 'network_provider' from source: set_fact 25201 1726882693.27148: Evaluated conditional (network_provider == "initscripts"): False 25201 1726882693.27158: when evaluation is False, skipping this task 25201 1726882693.27169: _execute() done 25201 1726882693.27179: dumping result to json 25201 1726882693.27188: done dumping result, returning 25201 1726882693.27202: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0e448fcc-3ce9-313b-197e-000000000026] 25201 1726882693.27257: sending task result for task 0e448fcc-3ce9-313b-197e-000000000026 skipping: [managed_node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 25201 1726882693.27408: no more pending results, returning what we have 25201 1726882693.27412: results queue empty 25201 1726882693.27413: checking for any_errors_fatal 25201 1726882693.27420: done checking for any_errors_fatal 25201 1726882693.27421: checking for max_fail_percentage 25201 1726882693.27423: done checking for max_fail_percentage 25201 1726882693.27424: checking to see if all hosts have failed and the running result is not ok 25201 1726882693.27425: done checking to see if all hosts have failed 25201 1726882693.27426: getting the remaining hosts for this loop 25201 1726882693.27427: done getting the remaining hosts for this loop 25201 1726882693.27431: getting the next task for host managed_node2 25201 1726882693.27439: done getting next task for host managed_node2 25201 1726882693.27443: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 25201 1726882693.27446: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25201 1726882693.27462: getting variables 25201 1726882693.27466: in VariableManager get_vars() 25201 1726882693.27508: Calling all_inventory to load vars for managed_node2 25201 1726882693.27511: Calling groups_inventory to load vars for managed_node2 25201 1726882693.27513: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882693.27524: Calling all_plugins_play to load vars for managed_node2 25201 1726882693.27526: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882693.27528: Calling groups_plugins_play to load vars for managed_node2 25201 1726882693.29083: done sending task result for task 0e448fcc-3ce9-313b-197e-000000000026 25201 1726882693.29087: WORKER PROCESS EXITING 25201 1726882693.29679: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882693.33445: done with get_vars() 25201 1726882693.33586: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 21:38:13 -0400 (0:00:00.094) 0:00:14.511 ****** 25201 1726882693.33673: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 25201 1726882693.33788: Creating lock for fedora.linux_system_roles.network_connections 25201 1726882693.34310: worker is 1 (out of 1 available) 25201 1726882693.34321: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 25201 1726882693.34447: done queuing things up, now waiting for results queue to drain 25201 1726882693.34449: waiting for pending results... 25201 1726882693.35128: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 25201 1726882693.35344: in run() - task 0e448fcc-3ce9-313b-197e-000000000027 25201 1726882693.35362: variable 'ansible_search_path' from source: unknown 25201 1726882693.35372: variable 'ansible_search_path' from source: unknown 25201 1726882693.35417: calling self._execute() 25201 1726882693.35510: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882693.35526: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882693.35540: variable 'omit' from source: magic vars 25201 1726882693.35957: variable 'ansible_distribution_major_version' from source: facts 25201 1726882693.35978: Evaluated conditional (ansible_distribution_major_version != '6'): True 25201 1726882693.35989: variable 'omit' from source: magic vars 25201 1726882693.36051: variable 'omit' from source: magic vars 25201 1726882693.36219: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 25201 1726882693.39081: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 25201 1726882693.39313: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 25201 1726882693.39353: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 25201 1726882693.39398: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 25201 1726882693.39428: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 25201 1726882693.39530: 
variable 'network_provider' from source: set_fact 25201 1726882693.39670: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25201 1726882693.39713: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25201 1726882693.39746: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25201 1726882693.39794: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25201 1726882693.39821: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25201 1726882693.39898: variable 'omit' from source: magic vars 25201 1726882693.40018: variable 'omit' from source: magic vars 25201 1726882693.40125: variable 'network_connections' from source: task vars 25201 1726882693.40145: variable 'interface' from source: play vars 25201 1726882693.40215: variable 'interface' from source: play vars 25201 1726882693.40389: variable 'omit' from source: magic vars 25201 1726882693.40402: variable '__lsr_ansible_managed' from source: task vars 25201 1726882693.40472: variable '__lsr_ansible_managed' from source: task vars 25201 1726882693.40839: Loaded config def from plugin (lookup/template) 25201 1726882693.40859: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 25201 1726882693.40932: File lookup term: get_ansible_managed.j2 25201 1726882693.40939: variable 'ansible_search_path' from source: unknown 25201 1726882693.40947: evaluation_path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 25201 1726882693.40965: search_path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 25201 1726882693.40986: variable 'ansible_search_path' from source: unknown 25201 1726882693.48937: variable 'ansible_managed' from source: unknown 25201 1726882693.49882: variable 'omit' from source: 
magic vars 25201 1726882693.49921: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25201 1726882693.49961: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25201 1726882693.49987: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25201 1726882693.50014: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25201 1726882693.50044: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25201 1726882693.50078: variable 'inventory_hostname' from source: host vars for 'managed_node2' 25201 1726882693.50087: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882693.50095: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882693.50326: Set connection var ansible_shell_executable to /bin/sh 25201 1726882693.50336: Set connection var ansible_pipelining to False 25201 1726882693.50344: Set connection var ansible_connection to ssh 25201 1726882693.50352: Set connection var ansible_module_compression to ZIP_DEFLATED 25201 1726882693.50357: Set connection var ansible_shell_type to sh 25201 1726882693.50369: Set connection var ansible_timeout to 10 25201 1726882693.50404: variable 'ansible_shell_executable' from source: unknown 25201 1726882693.50430: variable 'ansible_connection' from source: unknown 25201 1726882693.50438: variable 'ansible_module_compression' from source: unknown 25201 1726882693.50445: variable 'ansible_shell_type' from source: unknown 25201 1726882693.50451: variable 'ansible_shell_executable' from source: unknown 25201 1726882693.50458: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882693.50469: variable 'ansible_pipelining' from source: unknown 25201 1726882693.50476: variable 'ansible_timeout' from source: unknown 25201 1726882693.50484: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882693.50620: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 25201 1726882693.50650: variable 'omit' from source: magic vars 25201 1726882693.50661: starting attempt loop 25201 1726882693.50673: running the handler 25201 1726882693.50691: _low_level_execute_command(): starting 25201 1726882693.50704: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 25201 1726882693.52128: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25201 1726882693.52143: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882693.52165: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882693.52190: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882693.52231: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882693.52244: stderr chunk (state=3): >>>debug2: match not found <<< 25201 1726882693.52259: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 
1726882693.52285: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25201 1726882693.52300: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 25201 1726882693.52312: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 25201 1726882693.52325: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882693.52339: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882693.52355: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882693.52369: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882693.52385: stderr chunk (state=3): >>>debug2: match found <<< 25201 1726882693.52403: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882693.52477: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882693.52505: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882693.52526: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882693.52659: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882693.54324: stdout chunk (state=3): >>>/root <<< 25201 1726882693.54454: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882693.54555: stderr chunk (state=3): >>><<< 25201 1726882693.54558: stdout chunk (state=3): >>><<< 25201 1726882693.54675: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25201 1726882693.54679: _low_level_execute_command(): starting 25201 1726882693.54682: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882693.545883-25841-161626603471228 `" && echo ansible-tmp-1726882693.545883-25841-161626603471228="` echo /root/.ansible/tmp/ansible-tmp-1726882693.545883-25841-161626603471228 `" ) && sleep 0' 25201 1726882693.55351: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25201 1726882693.55376: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 
1726882693.55393: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882693.55413: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882693.55478: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882693.55496: stderr chunk (state=3): >>>debug2: match not found <<< 25201 1726882693.55509: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882693.55534: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25201 1726882693.55562: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 25201 1726882693.55578: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 25201 1726882693.55604: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882693.55623: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882693.55645: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882693.55661: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882693.55694: stderr chunk (state=3): >>>debug2: match found <<< 25201 1726882693.55710: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882693.55795: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882693.55816: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882693.55843: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882693.55977: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882693.57870: stdout chunk (state=3): >>>ansible-tmp-1726882693.545883-25841-161626603471228=/root/.ansible/tmp/ansible-tmp-1726882693.545883-25841-161626603471228 <<< 25201 1726882693.58000: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882693.58082: stderr chunk (state=3): >>><<< 25201 1726882693.58101: stdout chunk (state=3): >>><<< 25201 1726882693.58170: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882693.545883-25841-161626603471228=/root/.ansible/tmp/ansible-tmp-1726882693.545883-25841-161626603471228 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting 
O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25201 1726882693.58276: variable 'ansible_module_compression' from source: unknown 25201 1726882693.58279: ANSIBALLZ: Using lock for fedora.linux_system_roles.network_connections 25201 1726882693.58281: ANSIBALLZ: Acquiring lock 25201 1726882693.58283: ANSIBALLZ: Lock acquired: 140300033915600 25201 1726882693.58285: ANSIBALLZ: Creating module 25201 1726882693.82503: ANSIBALLZ: Writing module into payload 25201 1726882693.83032: ANSIBALLZ: Writing module 25201 1726882693.83060: ANSIBALLZ: Renaming module 25201 1726882693.83063: ANSIBALLZ: Done creating module 25201 1726882693.83091: variable 'ansible_facts' from source: unknown 25201 1726882693.83183: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882693.545883-25841-161626603471228/AnsiballZ_network_connections.py 25201 1726882693.83322: Sending initial data 25201 1726882693.83325: Sent initial data (167 bytes) 25201 1726882693.84237: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25201 1726882693.84784: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882693.84795: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882693.84810: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882693.84846: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882693.84857: stderr chunk (state=3): >>>debug2: match not found <<< 25201 1726882693.84860: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882693.84881: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25201 1726882693.84888: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 25201 1726882693.84895: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 25201 1726882693.84903: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882693.84912: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882693.84927: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882693.84930: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882693.84935: stderr chunk (state=3): >>>debug2: match found <<< 25201 1726882693.84945: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882693.85023: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882693.85086: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882693.85096: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882693.85385: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882693.87220: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server 
supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 25201 1726882693.87323: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 25201 1726882693.87435: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-25201fmfeipqk/tmpifr_pt58 /root/.ansible/tmp/ansible-tmp-1726882693.545883-25841-161626603471228/AnsiballZ_network_connections.py <<< 25201 1726882693.87532: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 25201 1726882693.89613: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882693.89692: stderr chunk (state=3): >>><<< 25201 1726882693.89695: stdout chunk (state=3): >>><<< 25201 1726882693.89720: done transferring module to remote 25201 1726882693.89732: _low_level_execute_command(): starting 25201 1726882693.89735: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882693.545883-25841-161626603471228/ /root/.ansible/tmp/ansible-tmp-1726882693.545883-25841-161626603471228/AnsiballZ_network_connections.py && sleep 0' 25201 1726882693.91489: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25201 1726882693.91498: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882693.91508: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882693.91521: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882693.91558: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882693.91569: stderr chunk (state=3): >>>debug2: match not found <<< 25201 1726882693.91577: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882693.91593: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25201 1726882693.91599: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 25201 1726882693.91606: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 25201 1726882693.91613: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882693.91622: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882693.91633: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882693.91640: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882693.91646: stderr chunk (state=3): >>>debug2: match found <<< 25201 1726882693.91655: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882693.91733: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882693.91746: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882693.91756: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 
1726882693.92043: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882693.93929: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882693.93932: stdout chunk (state=3): >>><<< 25201 1726882693.93939: stderr chunk (state=3): >>><<< 25201 1726882693.93959: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25201 1726882693.93969: _low_level_execute_command(): starting 25201 1726882693.93972: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882693.545883-25841-161626603471228/AnsiballZ_network_connections.py && sleep 0' 25201 1726882693.95517: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25201 1726882693.95523: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882693.95533: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882693.95546: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882693.95585: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882693.95706: stderr chunk (state=3): >>>debug2: match not found <<< 25201 1726882693.95715: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882693.95729: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25201 1726882693.95734: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 25201 1726882693.95741: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 25201 1726882693.95747: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882693.95756: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882693.95768: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882693.95775: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882693.95781: stderr chunk (state=3): >>>debug2: match found <<< 25201 1726882693.95790: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882693.95874: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882693.95882: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882693.95886: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882693.96148: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882695.60640: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[003] #0, state:up persistent_state:present, 'veth0': add connection veth0, 283a7ffe-9cfe-42e8-8b60-ec161f169c65\n[004] #0, state:up persistent_state:present, 'veth0': up connection veth0, 283a7ffe-9cfe-42e8-8b60-ec161f169c65 (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "veth0", "type": "ethernet", "state": "up", "ip": {"dhcp4": false, "auto6": false, "address": ["2001:db8::2/32", "2001:db8::3/32", "2001:db8::4/32"], "gateway6": "2001:db8::1"}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "veth0", "type": "ethernet", "state": "up", "ip": {"dhcp4": false, "auto6": false, "address": ["2001:db8::2/32", "2001:db8::3/32", "2001:db8::4/32"], "gateway6": "2001:db8::1"}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 25201 1726882695.62516: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. <<< 25201 1726882695.62577: stderr chunk (state=3): >>><<< 25201 1726882695.62581: stdout chunk (state=3): >>><<< 25201 1726882695.62598: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[003] #0, state:up persistent_state:present, 'veth0': add connection veth0, 283a7ffe-9cfe-42e8-8b60-ec161f169c65\n[004] #0, state:up persistent_state:present, 'veth0': up connection veth0, 283a7ffe-9cfe-42e8-8b60-ec161f169c65 (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "veth0", "type": "ethernet", "state": "up", "ip": {"dhcp4": false, "auto6": false, "address": ["2001:db8::2/32", "2001:db8::3/32", "2001:db8::4/32"], "gateway6": "2001:db8::1"}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "veth0", "type": "ethernet", "state": "up", "ip": {"dhcp4": false, "auto6": false, "address": ["2001:db8::2/32", "2001:db8::3/32", "2001:db8::4/32"], "gateway6": "2001:db8::1"}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 25201 1726882695.62631: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'veth0', 'type': 'ethernet', 'state': 'up', 'ip': {'dhcp4': False, 'auto6': False, 'address': ['2001:db8::2/32', '2001:db8::3/32', '2001:db8::4/32'], 'gateway6': '2001:db8::1'}}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882693.545883-25841-161626603471228/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 25201 1726882695.62639: _low_level_execute_command(): starting 25201 1726882695.62645: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882693.545883-25841-161626603471228/ > /dev/null 2>&1 && sleep 0' 25201 1726882695.63144: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882695.63150: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882695.63195: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882695.63203: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882695.63220: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<< 25201 1726882695.63225: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882695.63357: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882695.63361: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882695.63461: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 
25201 1726882695.65296: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882695.65339: stderr chunk (state=3): >>><<< 25201 1726882695.65342: stdout chunk (state=3): >>><<< 25201 1726882695.65355: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25201 1726882695.65361: handler run complete 25201 1726882695.65392: attempt loop complete, returning result 25201 1726882695.65395: _execute() done 25201 1726882695.65398: dumping result to json 25201 1726882695.65402: done dumping result, returning 25201 1726882695.65410: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0e448fcc-3ce9-313b-197e-000000000027] 25201 1726882695.65415: sending task result for task 0e448fcc-3ce9-313b-197e-000000000027 25201 1726882695.65519: done sending task result for task 0e448fcc-3ce9-313b-197e-000000000027 25201 1726882695.65521: WORKER PROCESS EXITING changed: [managed_node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "ip": { "address": [ "2001:db8::2/32", "2001:db8::3/32", "2001:db8::4/32" ], "auto6": false, "dhcp4": false, "gateway6": "2001:db8::1" }, "name": "veth0", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [003] #0, state:up persistent_state:present, 'veth0': add connection veth0, 283a7ffe-9cfe-42e8-8b60-ec161f169c65 [004] #0, state:up persistent_state:present, 'veth0': up connection veth0, 283a7ffe-9cfe-42e8-8b60-ec161f169c65 (not-active) 25201 1726882695.65632: no more pending results, returning what we have 25201 1726882695.65635: results queue empty 25201 1726882695.65636: checking for any_errors_fatal 25201 1726882695.65641: done checking for any_errors_fatal 25201 1726882695.65641: checking for max_fail_percentage 25201 1726882695.65643: done checking for max_fail_percentage 25201 1726882695.65644: checking to see if all hosts have failed and the running result is not ok 25201 1726882695.65645: done checking to see if all hosts have failed 25201 1726882695.65646: getting the remaining hosts for this loop 25201 1726882695.65647: done getting the remaining hosts for this loop 25201 1726882695.65651: getting the next task for host managed_node2 25201 1726882695.65656: done 
getting next task for host managed_node2 25201 1726882695.65660: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 25201 1726882695.65666: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25201 1726882695.65677: getting variables 25201 1726882695.65678: in VariableManager get_vars() 25201 1726882695.65717: Calling all_inventory to load vars for managed_node2 25201 1726882695.65720: Calling groups_inventory to load vars for managed_node2 25201 1726882695.65722: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882695.65731: Calling all_plugins_play to load vars for managed_node2 25201 1726882695.65733: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882695.65736: Calling groups_plugins_play to load vars for managed_node2 25201 1726882695.67339: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882695.69141: done with get_vars() 25201 1726882695.69166: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 21:38:15 -0400 (0:00:02.355) 0:00:16.867 ****** 25201 1726882695.69253: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_state 25201 1726882695.69255: Creating lock for fedora.linux_system_roles.network_state 25201 1726882695.69577: worker is 1 (out of 1 available) 25201 1726882695.69592: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_state 25201 1726882695.69605: done queuing things up, now waiting for results queue to drain 25201 1726882695.69607: waiting for pending results... 
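The module_args recorded for "Configure networking connection profiles" above are built from the play's network_connections data, with interface substituted from a play var. The test play itself is not included in this log, so the following is only a hedged reconstruction of what drives that invocation: the connection values are copied from the logged module_args, while the variable placement and the role entry point are assumptions.

    # Hedged reconstruction from the logged module_args; not the actual test play.
    - hosts: managed_node2
      vars:
        interface: veth0                  # 'interface' comes from play vars per the log
        network_connections:
          - name: "{{ interface }}"
            type: ethernet
            state: up
            ip:
              dhcp4: false
              auto6: false
              address:
                - 2001:db8::2/32
                - 2001:db8::3/32
                - 2001:db8::4/32
              gateway6: 2001:db8::1
      roles:
        - fedora.linux_system_roles.network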
25201 1726882695.69899: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state 25201 1726882695.70038: in run() - task 0e448fcc-3ce9-313b-197e-000000000028 25201 1726882695.70068: variable 'ansible_search_path' from source: unknown 25201 1726882695.70079: variable 'ansible_search_path' from source: unknown 25201 1726882695.70125: calling self._execute() 25201 1726882695.70221: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882695.70234: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882695.70248: variable 'omit' from source: magic vars 25201 1726882695.70636: variable 'ansible_distribution_major_version' from source: facts 25201 1726882695.70656: Evaluated conditional (ansible_distribution_major_version != '6'): True 25201 1726882695.70792: variable 'network_state' from source: role '' defaults 25201 1726882695.70811: Evaluated conditional (network_state != {}): False 25201 1726882695.70819: when evaluation is False, skipping this task 25201 1726882695.70827: _execute() done 25201 1726882695.70833: dumping result to json 25201 1726882695.70839: done dumping result, returning 25201 1726882695.70848: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state [0e448fcc-3ce9-313b-197e-000000000028] 25201 1726882695.70866: sending task result for task 0e448fcc-3ce9-313b-197e-000000000028 skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 25201 1726882695.71011: no more pending results, returning what we have 25201 1726882695.71017: results queue empty 25201 1726882695.71018: checking for any_errors_fatal 25201 1726882695.71025: done checking for any_errors_fatal 25201 1726882695.71026: checking for max_fail_percentage 25201 1726882695.71028: done checking for max_fail_percentage 25201 1726882695.71029: checking to see if all hosts have failed and the running result is not ok 25201 1726882695.71030: done checking to see if all hosts have failed 25201 1726882695.71031: getting the remaining hosts for this loop 25201 1726882695.71032: done getting the remaining hosts for this loop 25201 1726882695.71036: getting the next task for host managed_node2 25201 1726882695.71043: done getting next task for host managed_node2 25201 1726882695.71047: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 25201 1726882695.71050: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25201 1726882695.71068: getting variables 25201 1726882695.71070: in VariableManager get_vars() 25201 1726882695.71108: Calling all_inventory to load vars for managed_node2 25201 1726882695.71110: Calling groups_inventory to load vars for managed_node2 25201 1726882695.71113: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882695.71125: Calling all_plugins_play to load vars for managed_node2 25201 1726882695.71128: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882695.71131: Calling groups_plugins_play to load vars for managed_node2 25201 1726882695.72104: done sending task result for task 0e448fcc-3ce9-313b-197e-000000000028 25201 1726882695.72107: WORKER PROCESS EXITING 25201 1726882695.72820: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882695.74743: done with get_vars() 25201 1726882695.74767: done getting variables 25201 1726882695.74823: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 21:38:15 -0400 (0:00:00.056) 0:00:16.923 ****** 25201 1726882695.74860: entering _queue_task() for managed_node2/debug 25201 1726882695.75148: worker is 1 (out of 1 available) 25201 1726882695.75166: exiting _queue_task() for managed_node2/debug 25201 1726882695.75179: done queuing things up, now waiting for results queue to drain 25201 1726882695.75181: waiting for pending results... 
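The task queued here is a plain debug action, and its result further below prints __network_connections_result.stderr_lines, so the underlying task is most likely just a debug of that variable. A minimal hedged sketch, assuming the var: form (the exact task source is not in this log):

    # Hedged sketch: the output shown later in the log matches what a
    # debug task printing this variable produces.
    - name: Show stderr messages for the network_connections
      ansible.builtin.debug:
        var: __network_connections_result.stderr_lines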
25201 1726882695.75447: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 25201 1726882695.75582: in run() - task 0e448fcc-3ce9-313b-197e-000000000029 25201 1726882695.75605: variable 'ansible_search_path' from source: unknown 25201 1726882695.75613: variable 'ansible_search_path' from source: unknown 25201 1726882695.75653: calling self._execute() 25201 1726882695.75753: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882695.75768: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882695.75782: variable 'omit' from source: magic vars 25201 1726882695.76168: variable 'ansible_distribution_major_version' from source: facts 25201 1726882695.76188: Evaluated conditional (ansible_distribution_major_version != '6'): True 25201 1726882695.76200: variable 'omit' from source: magic vars 25201 1726882695.76259: variable 'omit' from source: magic vars 25201 1726882695.76305: variable 'omit' from source: magic vars 25201 1726882695.76348: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25201 1726882695.76398: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25201 1726882695.76421: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25201 1726882695.76442: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25201 1726882695.76457: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25201 1726882695.76501: variable 'inventory_hostname' from source: host vars for 'managed_node2' 25201 1726882695.76509: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882695.76516: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882695.76628: Set connection var ansible_shell_executable to /bin/sh 25201 1726882695.76638: Set connection var ansible_pipelining to False 25201 1726882695.76646: Set connection var ansible_connection to ssh 25201 1726882695.76655: Set connection var ansible_module_compression to ZIP_DEFLATED 25201 1726882695.76661: Set connection var ansible_shell_type to sh 25201 1726882695.76678: Set connection var ansible_timeout to 10 25201 1726882695.76710: variable 'ansible_shell_executable' from source: unknown 25201 1726882695.76717: variable 'ansible_connection' from source: unknown 25201 1726882695.76723: variable 'ansible_module_compression' from source: unknown 25201 1726882695.76729: variable 'ansible_shell_type' from source: unknown 25201 1726882695.76734: variable 'ansible_shell_executable' from source: unknown 25201 1726882695.76740: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882695.76746: variable 'ansible_pipelining' from source: unknown 25201 1726882695.76752: variable 'ansible_timeout' from source: unknown 25201 1726882695.76758: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882695.76906: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 25201 
1726882695.76927: variable 'omit' from source: magic vars 25201 1726882695.76935: starting attempt loop 25201 1726882695.76941: running the handler 25201 1726882695.77087: variable '__network_connections_result' from source: set_fact 25201 1726882695.77149: handler run complete 25201 1726882695.77176: attempt loop complete, returning result 25201 1726882695.77183: _execute() done 25201 1726882695.77189: dumping result to json 25201 1726882695.77196: done dumping result, returning 25201 1726882695.77208: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0e448fcc-3ce9-313b-197e-000000000029] 25201 1726882695.77218: sending task result for task 0e448fcc-3ce9-313b-197e-000000000029 25201 1726882695.77321: done sending task result for task 0e448fcc-3ce9-313b-197e-000000000029 25201 1726882695.77328: WORKER PROCESS EXITING ok: [managed_node2] => { "__network_connections_result.stderr_lines": [ "[003] #0, state:up persistent_state:present, 'veth0': add connection veth0, 283a7ffe-9cfe-42e8-8b60-ec161f169c65", "[004] #0, state:up persistent_state:present, 'veth0': up connection veth0, 283a7ffe-9cfe-42e8-8b60-ec161f169c65 (not-active)" ] } 25201 1726882695.77417: no more pending results, returning what we have 25201 1726882695.77420: results queue empty 25201 1726882695.77421: checking for any_errors_fatal 25201 1726882695.77426: done checking for any_errors_fatal 25201 1726882695.77427: checking for max_fail_percentage 25201 1726882695.77429: done checking for max_fail_percentage 25201 1726882695.77430: checking to see if all hosts have failed and the running result is not ok 25201 1726882695.77430: done checking to see if all hosts have failed 25201 1726882695.77431: getting the remaining hosts for this loop 25201 1726882695.77433: done getting the remaining hosts for this loop 25201 1726882695.77437: getting the next task for host managed_node2 25201 1726882695.77444: done getting next task for host managed_node2 25201 1726882695.77448: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 25201 1726882695.77452: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25201 1726882695.77467: getting variables 25201 1726882695.77470: in VariableManager get_vars() 25201 1726882695.77510: Calling all_inventory to load vars for managed_node2 25201 1726882695.77513: Calling groups_inventory to load vars for managed_node2 25201 1726882695.77516: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882695.77526: Calling all_plugins_play to load vars for managed_node2 25201 1726882695.77529: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882695.77532: Calling groups_plugins_play to load vars for managed_node2 25201 1726882695.79190: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882695.80999: done with get_vars() 25201 1726882695.81023: done getting variables 25201 1726882695.81081: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 21:38:15 -0400 (0:00:00.062) 0:00:16.985 ****** 25201 1726882695.81112: entering _queue_task() for managed_node2/debug 25201 1726882695.81371: worker is 1 (out of 1 available) 25201 1726882695.81384: exiting _queue_task() for managed_node2/debug 25201 1726882695.81395: done queuing things up, now waiting for results queue to drain 25201 1726882695.81396: waiting for pending results... 25201 1726882695.81662: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 25201 1726882695.81799: in run() - task 0e448fcc-3ce9-313b-197e-00000000002a 25201 1726882695.81816: variable 'ansible_search_path' from source: unknown 25201 1726882695.81822: variable 'ansible_search_path' from source: unknown 25201 1726882695.81861: calling self._execute() 25201 1726882695.81960: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882695.81977: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882695.81991: variable 'omit' from source: magic vars 25201 1726882695.82373: variable 'ansible_distribution_major_version' from source: facts 25201 1726882695.82395: Evaluated conditional (ansible_distribution_major_version != '6'): True 25201 1726882695.82405: variable 'omit' from source: magic vars 25201 1726882695.82471: variable 'omit' from source: magic vars 25201 1726882695.82512: variable 'omit' from source: magic vars 25201 1726882695.82558: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25201 1726882695.82603: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25201 1726882695.82627: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25201 1726882695.82653: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25201 1726882695.82677: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25201 1726882695.82713: variable 
'inventory_hostname' from source: host vars for 'managed_node2' 25201 1726882695.82720: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882695.82727: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882695.82837: Set connection var ansible_shell_executable to /bin/sh 25201 1726882695.82849: Set connection var ansible_pipelining to False 25201 1726882695.82860: Set connection var ansible_connection to ssh 25201 1726882695.82880: Set connection var ansible_module_compression to ZIP_DEFLATED 25201 1726882695.82887: Set connection var ansible_shell_type to sh 25201 1726882695.82898: Set connection var ansible_timeout to 10 25201 1726882695.82926: variable 'ansible_shell_executable' from source: unknown 25201 1726882695.82933: variable 'ansible_connection' from source: unknown 25201 1726882695.82939: variable 'ansible_module_compression' from source: unknown 25201 1726882695.82945: variable 'ansible_shell_type' from source: unknown 25201 1726882695.82950: variable 'ansible_shell_executable' from source: unknown 25201 1726882695.82956: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882695.82966: variable 'ansible_pipelining' from source: unknown 25201 1726882695.82973: variable 'ansible_timeout' from source: unknown 25201 1726882695.82984: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882695.83122: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 25201 1726882695.83143: variable 'omit' from source: magic vars 25201 1726882695.83152: starting attempt loop 25201 1726882695.83158: running the handler 25201 1726882695.83313: variable '__network_connections_result' from source: set_fact 25201 1726882695.83395: variable '__network_connections_result' from source: set_fact 25201 1726882695.83540: handler run complete 25201 1726882695.83576: attempt loop complete, returning result 25201 1726882695.83583: _execute() done 25201 1726882695.83592: dumping result to json 25201 1726882695.83600: done dumping result, returning 25201 1726882695.83610: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0e448fcc-3ce9-313b-197e-00000000002a] 25201 1726882695.83619: sending task result for task 0e448fcc-3ce9-313b-197e-00000000002a ok: [managed_node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "ip": { "address": [ "2001:db8::2/32", "2001:db8::3/32", "2001:db8::4/32" ], "auto6": false, "dhcp4": false, "gateway6": "2001:db8::1" }, "name": "veth0", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[003] #0, state:up persistent_state:present, 'veth0': add connection veth0, 283a7ffe-9cfe-42e8-8b60-ec161f169c65\n[004] #0, state:up persistent_state:present, 'veth0': up connection veth0, 283a7ffe-9cfe-42e8-8b60-ec161f169c65 (not-active)\n", "stderr_lines": [ "[003] #0, state:up persistent_state:present, 'veth0': add connection veth0, 283a7ffe-9cfe-42e8-8b60-ec161f169c65", "[004] #0, state:up persistent_state:present, 
'veth0': up connection veth0, 283a7ffe-9cfe-42e8-8b60-ec161f169c65 (not-active)" ] } } 25201 1726882695.83815: no more pending results, returning what we have 25201 1726882695.83819: results queue empty 25201 1726882695.83820: checking for any_errors_fatal 25201 1726882695.83824: done checking for any_errors_fatal 25201 1726882695.83825: checking for max_fail_percentage 25201 1726882695.83827: done checking for max_fail_percentage 25201 1726882695.83828: checking to see if all hosts have failed and the running result is not ok 25201 1726882695.83829: done checking to see if all hosts have failed 25201 1726882695.83829: getting the remaining hosts for this loop 25201 1726882695.83831: done getting the remaining hosts for this loop 25201 1726882695.83835: getting the next task for host managed_node2 25201 1726882695.83842: done getting next task for host managed_node2 25201 1726882695.83845: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 25201 1726882695.83848: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25201 1726882695.83858: getting variables 25201 1726882695.83860: in VariableManager get_vars() 25201 1726882695.83901: Calling all_inventory to load vars for managed_node2 25201 1726882695.83904: Calling groups_inventory to load vars for managed_node2 25201 1726882695.83906: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882695.83921: Calling all_plugins_play to load vars for managed_node2 25201 1726882695.83924: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882695.83927: Calling groups_plugins_play to load vars for managed_node2 25201 1726882695.85411: done sending task result for task 0e448fcc-3ce9-313b-197e-00000000002a 25201 1726882695.85414: WORKER PROCESS EXITING 25201 1726882695.87189: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882695.89946: done with get_vars() 25201 1726882695.89975: done getting variables 25201 1726882695.90039: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 21:38:15 -0400 (0:00:00.089) 0:00:17.075 ****** 25201 1726882695.90809: entering _queue_task() for managed_node2/debug 25201 1726882695.91338: worker is 1 (out of 1 available) 25201 1726882695.91469: exiting _queue_task() for managed_node2/debug 25201 1726882695.91575: done queuing things up, now waiting for results queue to drain 25201 1726882695.91577: waiting for pending results... 
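The module arguments echoed in the previous result show what the play fed into the role. A hypothetical sketch of the network_connections input that would produce that invocation; the individual values are copied from the module_args above, while the surrounding variable layout is an assumption.

  # Hypothetical play variable; values copied from the module_args shown above.
  network_connections:
    - name: veth0
      type: ethernet
      state: up
      ip:
        dhcp4: false
        auto6: false
        address:
          - "2001:db8::2/32"
          - "2001:db8::3/32"
          - "2001:db8::4/32"
        gateway6: "2001:db8::1"
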
25201 1726882695.92226: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 25201 1726882695.92342: in run() - task 0e448fcc-3ce9-313b-197e-00000000002b 25201 1726882695.92356: variable 'ansible_search_path' from source: unknown 25201 1726882695.92361: variable 'ansible_search_path' from source: unknown 25201 1726882695.92402: calling self._execute() 25201 1726882695.92489: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882695.92495: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882695.92607: variable 'omit' from source: magic vars 25201 1726882695.92982: variable 'ansible_distribution_major_version' from source: facts 25201 1726882695.92994: Evaluated conditional (ansible_distribution_major_version != '6'): True 25201 1726882695.93572: variable 'network_state' from source: role '' defaults 25201 1726882695.93575: Evaluated conditional (network_state != {}): False 25201 1726882695.93577: when evaluation is False, skipping this task 25201 1726882695.93579: _execute() done 25201 1726882695.93581: dumping result to json 25201 1726882695.93583: done dumping result, returning 25201 1726882695.93585: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0e448fcc-3ce9-313b-197e-00000000002b] 25201 1726882695.93587: sending task result for task 0e448fcc-3ce9-313b-197e-00000000002b 25201 1726882695.93650: done sending task result for task 0e448fcc-3ce9-313b-197e-00000000002b 25201 1726882695.93653: WORKER PROCESS EXITING skipping: [managed_node2] => { "false_condition": "network_state != {}" } 25201 1726882695.93711: no more pending results, returning what we have 25201 1726882695.93714: results queue empty 25201 1726882695.93715: checking for any_errors_fatal 25201 1726882695.93721: done checking for any_errors_fatal 25201 1726882695.93722: checking for max_fail_percentage 25201 1726882695.93724: done checking for max_fail_percentage 25201 1726882695.93725: checking to see if all hosts have failed and the running result is not ok 25201 1726882695.93726: done checking to see if all hosts have failed 25201 1726882695.93727: getting the remaining hosts for this loop 25201 1726882695.93728: done getting the remaining hosts for this loop 25201 1726882695.93731: getting the next task for host managed_node2 25201 1726882695.93737: done getting next task for host managed_node2 25201 1726882695.93740: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 25201 1726882695.93743: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25201 1726882695.93755: getting variables 25201 1726882695.93756: in VariableManager get_vars() 25201 1726882695.93794: Calling all_inventory to load vars for managed_node2 25201 1726882695.93797: Calling groups_inventory to load vars for managed_node2 25201 1726882695.93800: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882695.93809: Calling all_plugins_play to load vars for managed_node2 25201 1726882695.93812: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882695.93815: Calling groups_plugins_play to load vars for managed_node2 25201 1726882695.95205: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882695.97606: done with get_vars() 25201 1726882695.97632: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 21:38:15 -0400 (0:00:00.076) 0:00:17.151 ****** 25201 1726882695.97732: entering _queue_task() for managed_node2/ping 25201 1726882695.97734: Creating lock for ping 25201 1726882695.98013: worker is 1 (out of 1 available) 25201 1726882695.98025: exiting _queue_task() for managed_node2/ping 25201 1726882695.98036: done queuing things up, now waiting for results queue to drain 25201 1726882695.98037: waiting for pending results... 25201 1726882695.98480: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 25201 1726882695.98613: in run() - task 0e448fcc-3ce9-313b-197e-00000000002c 25201 1726882695.98636: variable 'ansible_search_path' from source: unknown 25201 1726882695.98648: variable 'ansible_search_path' from source: unknown 25201 1726882695.98695: calling self._execute() 25201 1726882695.98789: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882695.98801: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882695.98817: variable 'omit' from source: magic vars 25201 1726882695.99210: variable 'ansible_distribution_major_version' from source: facts 25201 1726882695.99230: Evaluated conditional (ansible_distribution_major_version != '6'): True 25201 1726882695.99240: variable 'omit' from source: magic vars 25201 1726882695.99311: variable 'omit' from source: magic vars 25201 1726882695.99349: variable 'omit' from source: magic vars 25201 1726882695.99411: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25201 1726882695.99450: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25201 1726882695.99480: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25201 1726882695.99502: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25201 1726882695.99524: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25201 1726882695.99569: variable 'inventory_hostname' from source: host vars for 'managed_node2' 25201 1726882695.99579: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882695.99588: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882695.99701: Set connection var ansible_shell_executable to 
/bin/sh 25201 1726882695.99712: Set connection var ansible_pipelining to False 25201 1726882695.99721: Set connection var ansible_connection to ssh 25201 1726882695.99736: Set connection var ansible_module_compression to ZIP_DEFLATED 25201 1726882695.99743: Set connection var ansible_shell_type to sh 25201 1726882695.99754: Set connection var ansible_timeout to 10 25201 1726882695.99783: variable 'ansible_shell_executable' from source: unknown 25201 1726882695.99791: variable 'ansible_connection' from source: unknown 25201 1726882695.99799: variable 'ansible_module_compression' from source: unknown 25201 1726882695.99807: variable 'ansible_shell_type' from source: unknown 25201 1726882695.99813: variable 'ansible_shell_executable' from source: unknown 25201 1726882695.99819: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882695.99826: variable 'ansible_pipelining' from source: unknown 25201 1726882695.99832: variable 'ansible_timeout' from source: unknown 25201 1726882695.99845: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882696.00059: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 25201 1726882696.00081: variable 'omit' from source: magic vars 25201 1726882696.00090: starting attempt loop 25201 1726882696.00097: running the handler 25201 1726882696.00115: _low_level_execute_command(): starting 25201 1726882696.00127: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 25201 1726882696.00903: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25201 1726882696.00917: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882696.00937: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882696.00956: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882696.01003: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882696.01018: stderr chunk (state=3): >>>debug2: match not found <<< 25201 1726882696.01037: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882696.01058: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25201 1726882696.01076: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 25201 1726882696.01089: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 25201 1726882696.01103: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882696.01116: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882696.01136: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882696.01176: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882696.01189: stderr chunk (state=3): >>>debug2: match found <<< 25201 1726882696.01205: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882696.01303: stderr chunk (state=3): >>>debug1: 
auto-mux: Trying existing master <<< 25201 1726882696.01321: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882696.01335: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882696.01516: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882696.03185: stdout chunk (state=3): >>>/root <<< 25201 1726882696.03285: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882696.03468: stderr chunk (state=3): >>><<< 25201 1726882696.03473: stdout chunk (state=3): >>><<< 25201 1726882696.03478: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25201 1726882696.03480: _low_level_execute_command(): starting 25201 1726882696.03482: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882696.03375-25936-198965356935281 `" && echo ansible-tmp-1726882696.03375-25936-198965356935281="` echo /root/.ansible/tmp/ansible-tmp-1726882696.03375-25936-198965356935281 `" ) && sleep 0' 25201 1726882696.04437: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25201 1726882696.04455: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882696.04474: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882696.04493: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882696.04604: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882696.04620: stderr chunk (state=3): >>>debug2: match not found <<< 25201 1726882696.04634: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882696.04651: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25201 1726882696.04667: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 25201 1726882696.04682: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 25201 1726882696.04695: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882696.04709: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config <<< 25201 1726882696.04730: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882696.04742: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882696.04754: stderr chunk (state=3): >>>debug2: match found <<< 25201 1726882696.04772: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882696.04849: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882696.04871: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882696.04887: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882696.05039: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882696.06905: stdout chunk (state=3): >>>ansible-tmp-1726882696.03375-25936-198965356935281=/root/.ansible/tmp/ansible-tmp-1726882696.03375-25936-198965356935281 <<< 25201 1726882696.07021: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882696.07062: stderr chunk (state=3): >>><<< 25201 1726882696.07069: stdout chunk (state=3): >>><<< 25201 1726882696.07082: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882696.03375-25936-198965356935281=/root/.ansible/tmp/ansible-tmp-1726882696.03375-25936-198965356935281 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25201 1726882696.07116: variable 'ansible_module_compression' from source: unknown 25201 1726882696.07153: ANSIBALLZ: Using lock for ping 25201 1726882696.07156: ANSIBALLZ: Acquiring lock 25201 1726882696.07158: ANSIBALLZ: Lock acquired: 140300034012176 25201 1726882696.07161: ANSIBALLZ: Creating module 25201 1726882696.17871: ANSIBALLZ: Writing module into payload 25201 1726882696.17915: ANSIBALLZ: Writing module 25201 1726882696.17940: ANSIBALLZ: Renaming module 25201 1726882696.17950: ANSIBALLZ: Done creating module 25201 1726882696.17982: variable 'ansible_facts' from source: unknown 25201 1726882696.18060: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882696.03375-25936-198965356935281/AnsiballZ_ping.py 25201 1726882696.18187: Sending initial data 25201 1726882696.18190: Sent initial data (151 bytes) 25201 1726882696.19175: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 
3.2.2 4 Jun 2024 <<< 25201 1726882696.19187: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882696.19198: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882696.19213: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882696.19252: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882696.19271: stderr chunk (state=3): >>>debug2: match not found <<< 25201 1726882696.19289: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882696.19309: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25201 1726882696.19317: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 25201 1726882696.19324: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 25201 1726882696.19331: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882696.19340: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882696.19357: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882696.19360: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882696.19377: stderr chunk (state=3): >>>debug2: match found <<< 25201 1726882696.19392: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882696.19518: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882696.19536: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882696.19549: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882696.19681: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882696.21546: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 25201 1726882696.21636: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 25201 1726882696.21735: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-25201fmfeipqk/tmpgsvwh7ne /root/.ansible/tmp/ansible-tmp-1726882696.03375-25936-198965356935281/AnsiballZ_ping.py <<< 25201 1726882696.21828: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 25201 1726882696.22996: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882696.23087: stderr chunk (state=3): >>><<< 25201 1726882696.23090: stdout chunk (state=3): >>><<< 25201 1726882696.23104: done transferring module to remote 25201 1726882696.23113: 
_low_level_execute_command(): starting 25201 1726882696.23117: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882696.03375-25936-198965356935281/ /root/.ansible/tmp/ansible-tmp-1726882696.03375-25936-198965356935281/AnsiballZ_ping.py && sleep 0' 25201 1726882696.23540: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882696.23543: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882696.23587: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882696.23590: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882696.23593: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882696.23595: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882696.23638: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882696.23641: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882696.23745: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882696.25501: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882696.25546: stderr chunk (state=3): >>><<< 25201 1726882696.25549: stdout chunk (state=3): >>><<< 25201 1726882696.25561: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25201 1726882696.25569: _low_level_execute_command(): starting 25201 1726882696.25571: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 
/root/.ansible/tmp/ansible-tmp-1726882696.03375-25936-198965356935281/AnsiballZ_ping.py && sleep 0' 25201 1726882696.25969: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882696.25975: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882696.26015: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882696.26018: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882696.26020: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882696.26071: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882696.26075: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882696.26088: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882696.26200: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882696.39170: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 25201 1726882696.40189: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. <<< 25201 1726882696.40245: stderr chunk (state=3): >>><<< 25201 1726882696.40249: stdout chunk (state=3): >>><<< 25201 1726882696.40262: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
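The exchange above is Ansible's standard module execution path: create a remote temp directory, sftp the AnsiballZ payload into it, chmod it, run it with the remote Python, and (below) remove the temp directory again. The module being run is ping, and its module_args fall back to the default data of "pong", so the task itself is likely the bare module. A minimal sketch, assuming the task at roles/network/tasks/main.yml:192 passes no arguments:

  # Sketch only; assumes the role calls ping with no arguments.
  - name: Re-test connectivity
    ping:
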
25201 1726882696.40292: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882696.03375-25936-198965356935281/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 25201 1726882696.40298: _low_level_execute_command(): starting 25201 1726882696.40303: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882696.03375-25936-198965356935281/ > /dev/null 2>&1 && sleep 0' 25201 1726882696.40777: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882696.40781: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882696.40823: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 25201 1726882696.40827: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882696.40829: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<< 25201 1726882696.40831: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882696.40885: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882696.40888: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882696.40894: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882696.40992: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882696.42799: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882696.42840: stderr chunk (state=3): >>><<< 25201 1726882696.42844: stdout chunk (state=3): >>><<< 25201 1726882696.42856: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25201 1726882696.42862: handler run complete 25201 1726882696.42881: attempt loop complete, returning result 25201 1726882696.42884: _execute() done 25201 1726882696.42886: dumping result to json 25201 1726882696.42888: done dumping result, returning 25201 1726882696.42895: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [0e448fcc-3ce9-313b-197e-00000000002c] 25201 1726882696.42900: sending task result for task 0e448fcc-3ce9-313b-197e-00000000002c 25201 1726882696.42983: done sending task result for task 0e448fcc-3ce9-313b-197e-00000000002c 25201 1726882696.42986: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "ping": "pong" } 25201 1726882696.43040: no more pending results, returning what we have 25201 1726882696.43043: results queue empty 25201 1726882696.43044: checking for any_errors_fatal 25201 1726882696.43048: done checking for any_errors_fatal 25201 1726882696.43049: checking for max_fail_percentage 25201 1726882696.43050: done checking for max_fail_percentage 25201 1726882696.43051: checking to see if all hosts have failed and the running result is not ok 25201 1726882696.43052: done checking to see if all hosts have failed 25201 1726882696.43052: getting the remaining hosts for this loop 25201 1726882696.43054: done getting the remaining hosts for this loop 25201 1726882696.43057: getting the next task for host managed_node2 25201 1726882696.43068: done getting next task for host managed_node2 25201 1726882696.43070: ^ task is: TASK: meta (role_complete) 25201 1726882696.43073: ^ state is: HOST STATE: block=3, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25201 1726882696.43083: getting variables 25201 1726882696.43084: in VariableManager get_vars() 25201 1726882696.43126: Calling all_inventory to load vars for managed_node2 25201 1726882696.43129: Calling groups_inventory to load vars for managed_node2 25201 1726882696.43132: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882696.43142: Calling all_plugins_play to load vars for managed_node2 25201 1726882696.43144: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882696.43147: Calling groups_plugins_play to load vars for managed_node2 25201 1726882696.44073: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882696.44992: done with get_vars() 25201 1726882696.45009: done getting variables 25201 1726882696.45066: done queuing things up, now waiting for results queue to drain 25201 1726882696.45068: results queue empty 25201 1726882696.45068: checking for any_errors_fatal 25201 1726882696.45070: done checking for any_errors_fatal 25201 1726882696.45070: checking for max_fail_percentage 25201 1726882696.45071: done checking for max_fail_percentage 25201 1726882696.45071: checking to see if all hosts have failed and the running result is not ok 25201 1726882696.45072: done checking to see if all hosts have failed 25201 1726882696.45073: getting the remaining hosts for this loop 25201 1726882696.45073: done getting the remaining hosts for this loop 25201 1726882696.45075: getting the next task for host managed_node2 25201 1726882696.45078: done getting next task for host managed_node2 25201 1726882696.45079: ^ task is: TASK: Include the task 'assert_device_present.yml' 25201 1726882696.45080: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25201 1726882696.45082: getting variables 25201 1726882696.45082: in VariableManager get_vars() 25201 1726882696.45092: Calling all_inventory to load vars for managed_node2 25201 1726882696.45093: Calling groups_inventory to load vars for managed_node2 25201 1726882696.45094: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882696.45097: Calling all_plugins_play to load vars for managed_node2 25201 1726882696.45099: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882696.45100: Calling groups_plugins_play to load vars for managed_node2 25201 1726882696.45809: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882696.46709: done with get_vars() 25201 1726882696.46723: done getting variables TASK [Include the task 'assert_device_present.yml'] **************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:47 Friday 20 September 2024 21:38:16 -0400 (0:00:00.490) 0:00:17.642 ****** 25201 1726882696.46777: entering _queue_task() for managed_node2/include_tasks 25201 1726882696.46982: worker is 1 (out of 1 available) 25201 1726882696.46995: exiting _queue_task() for managed_node2/include_tasks 25201 1726882696.47008: done queuing things up, now waiting for results queue to drain 25201 1726882696.47009: waiting for pending results... 
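The task queued here is an include rather than a module execution, which is why no connection is opened for it. A sketch of what the entry at playbooks/tests_ipv6.yml:47 plausibly looks like, assuming a bare include_tasks with the relative path that the loader resolves below:

  # Sketch of the include entry; the relative path matches the file loaded below.
  - name: Include the task 'assert_device_present.yml'
    include_tasks: tasks/assert_device_present.yml
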
25201 1726882696.47180: running TaskExecutor() for managed_node2/TASK: Include the task 'assert_device_present.yml' 25201 1726882696.47249: in run() - task 0e448fcc-3ce9-313b-197e-00000000005c 25201 1726882696.47259: variable 'ansible_search_path' from source: unknown 25201 1726882696.47294: calling self._execute() 25201 1726882696.47354: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882696.47358: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882696.47371: variable 'omit' from source: magic vars 25201 1726882696.47636: variable 'ansible_distribution_major_version' from source: facts 25201 1726882696.47647: Evaluated conditional (ansible_distribution_major_version != '6'): True 25201 1726882696.47652: _execute() done 25201 1726882696.47655: dumping result to json 25201 1726882696.47659: done dumping result, returning 25201 1726882696.47665: done running TaskExecutor() for managed_node2/TASK: Include the task 'assert_device_present.yml' [0e448fcc-3ce9-313b-197e-00000000005c] 25201 1726882696.47675: sending task result for task 0e448fcc-3ce9-313b-197e-00000000005c 25201 1726882696.47756: done sending task result for task 0e448fcc-3ce9-313b-197e-00000000005c 25201 1726882696.47758: WORKER PROCESS EXITING 25201 1726882696.47799: no more pending results, returning what we have 25201 1726882696.47804: in VariableManager get_vars() 25201 1726882696.47840: Calling all_inventory to load vars for managed_node2 25201 1726882696.47843: Calling groups_inventory to load vars for managed_node2 25201 1726882696.47845: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882696.47854: Calling all_plugins_play to load vars for managed_node2 25201 1726882696.47856: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882696.47858: Calling groups_plugins_play to load vars for managed_node2 25201 1726882696.48614: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882696.49531: done with get_vars() 25201 1726882696.49545: variable 'ansible_search_path' from source: unknown 25201 1726882696.49554: we have included files to process 25201 1726882696.49555: generating all_blocks data 25201 1726882696.49556: done generating all_blocks data 25201 1726882696.49560: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 25201 1726882696.49561: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 25201 1726882696.49562: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 25201 1726882696.49669: in VariableManager get_vars() 25201 1726882696.49684: done with get_vars() 25201 1726882696.49755: done processing included file 25201 1726882696.49756: iterating over new_blocks loaded from include file 25201 1726882696.49757: in VariableManager get_vars() 25201 1726882696.49770: done with get_vars() 25201 1726882696.49771: filtering new block on tags 25201 1726882696.49784: done filtering new block on tags 25201 1726882696.49785: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml for managed_node2 25201 1726882696.49788: extending task lists for 
all hosts with included blocks 25201 1726882696.50967: done extending task lists 25201 1726882696.50968: done processing included files 25201 1726882696.50968: results queue empty 25201 1726882696.50969: checking for any_errors_fatal 25201 1726882696.50970: done checking for any_errors_fatal 25201 1726882696.50970: checking for max_fail_percentage 25201 1726882696.50971: done checking for max_fail_percentage 25201 1726882696.50972: checking to see if all hosts have failed and the running result is not ok 25201 1726882696.50972: done checking to see if all hosts have failed 25201 1726882696.50972: getting the remaining hosts for this loop 25201 1726882696.50973: done getting the remaining hosts for this loop 25201 1726882696.50975: getting the next task for host managed_node2 25201 1726882696.50977: done getting next task for host managed_node2 25201 1726882696.50978: ^ task is: TASK: Include the task 'get_interface_stat.yml' 25201 1726882696.50980: ^ state is: HOST STATE: block=3, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25201 1726882696.50981: getting variables 25201 1726882696.50982: in VariableManager get_vars() 25201 1726882696.50990: Calling all_inventory to load vars for managed_node2 25201 1726882696.50991: Calling groups_inventory to load vars for managed_node2 25201 1726882696.50993: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882696.50996: Calling all_plugins_play to load vars for managed_node2 25201 1726882696.50997: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882696.50999: Calling groups_plugins_play to load vars for managed_node2 25201 1726882696.51665: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882696.52560: done with get_vars() 25201 1726882696.52576: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Friday 20 September 2024 21:38:16 -0400 (0:00:00.058) 0:00:17.700 ****** 25201 1726882696.52624: entering _queue_task() for managed_node2/include_tasks 25201 1726882696.52833: worker is 1 (out of 1 available) 25201 1726882696.52845: exiting _queue_task() for managed_node2/include_tasks 25201 1726882696.52857: done queuing things up, now waiting for results queue to drain 25201 1726882696.52859: waiting for pending results... 
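assert_device_present.yml itself starts with another include (task path assert_device_present.yml:3 above). A sketch of that first entry, again assuming a bare include_tasks pointing at the file the loader reads next:

  # Sketch; path is relative to assert_device_present.yml's own directory.
  - name: Include the task 'get_interface_stat.yml'
    include_tasks: get_interface_stat.yml
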
25201 1726882696.53028: running TaskExecutor() for managed_node2/TASK: Include the task 'get_interface_stat.yml' 25201 1726882696.53100: in run() - task 0e448fcc-3ce9-313b-197e-0000000002b5 25201 1726882696.53110: variable 'ansible_search_path' from source: unknown 25201 1726882696.53113: variable 'ansible_search_path' from source: unknown 25201 1726882696.53140: calling self._execute() 25201 1726882696.53213: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882696.53217: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882696.53225: variable 'omit' from source: magic vars 25201 1726882696.53488: variable 'ansible_distribution_major_version' from source: facts 25201 1726882696.53498: Evaluated conditional (ansible_distribution_major_version != '6'): True 25201 1726882696.53503: _execute() done 25201 1726882696.53511: dumping result to json 25201 1726882696.53517: done dumping result, returning 25201 1726882696.53522: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_interface_stat.yml' [0e448fcc-3ce9-313b-197e-0000000002b5] 25201 1726882696.53530: sending task result for task 0e448fcc-3ce9-313b-197e-0000000002b5 25201 1726882696.53610: done sending task result for task 0e448fcc-3ce9-313b-197e-0000000002b5 25201 1726882696.53613: WORKER PROCESS EXITING 25201 1726882696.53656: no more pending results, returning what we have 25201 1726882696.53661: in VariableManager get_vars() 25201 1726882696.53702: Calling all_inventory to load vars for managed_node2 25201 1726882696.53705: Calling groups_inventory to load vars for managed_node2 25201 1726882696.53707: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882696.53716: Calling all_plugins_play to load vars for managed_node2 25201 1726882696.53718: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882696.53721: Calling groups_plugins_play to load vars for managed_node2 25201 1726882696.54534: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882696.58241: done with get_vars() 25201 1726882696.58254: variable 'ansible_search_path' from source: unknown 25201 1726882696.58255: variable 'ansible_search_path' from source: unknown 25201 1726882696.58285: we have included files to process 25201 1726882696.58286: generating all_blocks data 25201 1726882696.58287: done generating all_blocks data 25201 1726882696.58288: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 25201 1726882696.58288: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 25201 1726882696.58289: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 25201 1726882696.58423: done processing included file 25201 1726882696.58424: iterating over new_blocks loaded from include file 25201 1726882696.58425: in VariableManager get_vars() 25201 1726882696.58437: done with get_vars() 25201 1726882696.58438: filtering new block on tags 25201 1726882696.58446: done filtering new block on tags 25201 1726882696.58448: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node2 25201 
1726882696.58451: extending task lists for all hosts with included blocks 25201 1726882696.58514: done extending task lists 25201 1726882696.58515: done processing included files 25201 1726882696.58515: results queue empty 25201 1726882696.58516: checking for any_errors_fatal 25201 1726882696.58517: done checking for any_errors_fatal 25201 1726882696.58518: checking for max_fail_percentage 25201 1726882696.58518: done checking for max_fail_percentage 25201 1726882696.58519: checking to see if all hosts have failed and the running result is not ok 25201 1726882696.58519: done checking to see if all hosts have failed 25201 1726882696.58520: getting the remaining hosts for this loop 25201 1726882696.58520: done getting the remaining hosts for this loop 25201 1726882696.58522: getting the next task for host managed_node2 25201 1726882696.58524: done getting next task for host managed_node2 25201 1726882696.58525: ^ task is: TASK: Get stat for interface {{ interface }} 25201 1726882696.58527: ^ state is: HOST STATE: block=3, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25201 1726882696.58528: getting variables 25201 1726882696.58529: in VariableManager get_vars() 25201 1726882696.58537: Calling all_inventory to load vars for managed_node2 25201 1726882696.58539: Calling groups_inventory to load vars for managed_node2 25201 1726882696.58540: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882696.58543: Calling all_plugins_play to load vars for managed_node2 25201 1726882696.58544: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882696.58546: Calling groups_plugins_play to load vars for managed_node2 25201 1726882696.59199: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882696.60091: done with get_vars() 25201 1726882696.60104: done getting variables 25201 1726882696.60199: variable 'interface' from source: play vars TASK [Get stat for interface veth0] ******************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 21:38:16 -0400 (0:00:00.075) 0:00:17.776 ****** 25201 1726882696.60216: entering _queue_task() for managed_node2/stat 25201 1726882696.60442: worker is 1 (out of 1 available) 25201 1726882696.60455: exiting _queue_task() for managed_node2/stat 25201 1726882696.60467: done queuing things up, now waiting for results queue to drain 25201 1726882696.60470: waiting for pending results... 
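The stat task queued here lives at get_interface_stat.yml:3, and its exact module arguments appear below in the 'invocation' section of the result, so it can be sketched with reasonable confidence. The register name is an inference from the later assert, which reads interface_stat.stat.exists (registered results and set_fact values share the same source label in this variable trace):

# get_interface_stat.yml -- reconstructed sketch based on the module_args logged below
- name: Get stat for interface {{ interface }}
  stat:
    path: /sys/class/net/{{ interface }}
    get_attributes: false
    get_checksum: false
    get_mime: false
  register: interface_stat   # inferred; the assert later reads interface_stat.stat.exists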
25201 1726882696.60645: running TaskExecutor() for managed_node2/TASK: Get stat for interface veth0 25201 1726882696.60717: in run() - task 0e448fcc-3ce9-313b-197e-0000000003a0 25201 1726882696.60725: variable 'ansible_search_path' from source: unknown 25201 1726882696.60729: variable 'ansible_search_path' from source: unknown 25201 1726882696.60755: calling self._execute() 25201 1726882696.60825: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882696.60829: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882696.60838: variable 'omit' from source: magic vars 25201 1726882696.61104: variable 'ansible_distribution_major_version' from source: facts 25201 1726882696.61114: Evaluated conditional (ansible_distribution_major_version != '6'): True 25201 1726882696.61121: variable 'omit' from source: magic vars 25201 1726882696.61150: variable 'omit' from source: magic vars 25201 1726882696.61220: variable 'interface' from source: play vars 25201 1726882696.61232: variable 'omit' from source: magic vars 25201 1726882696.61267: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25201 1726882696.61294: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25201 1726882696.61310: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25201 1726882696.61324: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25201 1726882696.61335: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25201 1726882696.61359: variable 'inventory_hostname' from source: host vars for 'managed_node2' 25201 1726882696.61362: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882696.61369: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882696.61435: Set connection var ansible_shell_executable to /bin/sh 25201 1726882696.61439: Set connection var ansible_pipelining to False 25201 1726882696.61442: Set connection var ansible_connection to ssh 25201 1726882696.61448: Set connection var ansible_module_compression to ZIP_DEFLATED 25201 1726882696.61455: Set connection var ansible_shell_type to sh 25201 1726882696.61462: Set connection var ansible_timeout to 10 25201 1726882696.61484: variable 'ansible_shell_executable' from source: unknown 25201 1726882696.61487: variable 'ansible_connection' from source: unknown 25201 1726882696.61490: variable 'ansible_module_compression' from source: unknown 25201 1726882696.61492: variable 'ansible_shell_type' from source: unknown 25201 1726882696.61494: variable 'ansible_shell_executable' from source: unknown 25201 1726882696.61497: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882696.61501: variable 'ansible_pipelining' from source: unknown 25201 1726882696.61503: variable 'ansible_timeout' from source: unknown 25201 1726882696.61507: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882696.61646: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 25201 1726882696.61655: variable 'omit' from 
source: magic vars 25201 1726882696.61660: starting attempt loop 25201 1726882696.61663: running the handler 25201 1726882696.61679: _low_level_execute_command(): starting 25201 1726882696.61686: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 25201 1726882696.62203: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882696.62225: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882696.62242: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882696.62255: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882696.62302: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882696.62314: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882696.62431: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882696.64082: stdout chunk (state=3): >>>/root <<< 25201 1726882696.64188: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882696.64236: stderr chunk (state=3): >>><<< 25201 1726882696.64242: stdout chunk (state=3): >>><<< 25201 1726882696.64262: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25201 1726882696.64280: _low_level_execute_command(): starting 25201 1726882696.64284: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882696.6426735-25955-193046000513313 `" && echo 
ansible-tmp-1726882696.6426735-25955-193046000513313="` echo /root/.ansible/tmp/ansible-tmp-1726882696.6426735-25955-193046000513313 `" ) && sleep 0' 25201 1726882696.64716: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882696.64720: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882696.64752: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882696.64769: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882696.64818: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882696.64829: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882696.64941: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882696.66822: stdout chunk (state=3): >>>ansible-tmp-1726882696.6426735-25955-193046000513313=/root/.ansible/tmp/ansible-tmp-1726882696.6426735-25955-193046000513313 <<< 25201 1726882696.66932: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882696.66981: stderr chunk (state=3): >>><<< 25201 1726882696.66984: stdout chunk (state=3): >>><<< 25201 1726882696.66998: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882696.6426735-25955-193046000513313=/root/.ansible/tmp/ansible-tmp-1726882696.6426735-25955-193046000513313 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25201 1726882696.67035: variable 'ansible_module_compression' from source: unknown 25201 1726882696.67084: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-25201fmfeipqk/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 25201 1726882696.67116: variable 'ansible_facts' from source: unknown 25201 1726882696.67167: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882696.6426735-25955-193046000513313/AnsiballZ_stat.py 25201 1726882696.67262: Sending initial data 25201 1726882696.67269: Sent initial data (153 bytes) 25201 1726882696.67899: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882696.67905: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882696.67936: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882696.67959: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882696.67962: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<< 25201 1726882696.67969: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882696.68012: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882696.68018: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882696.68131: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882696.69875: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 25201 1726882696.69968: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 25201 1726882696.70069: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-25201fmfeipqk/tmpfie9iu5t /root/.ansible/tmp/ansible-tmp-1726882696.6426735-25955-193046000513313/AnsiballZ_stat.py <<< 25201 1726882696.70200: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 25201 1726882696.71820: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882696.71892: stderr chunk (state=3): >>><<< 25201 1726882696.71895: stdout chunk (state=3): >>><<< 25201 1726882696.71915: done transferring module to remote 25201 1726882696.71926: _low_level_execute_command(): starting 25201 1726882696.71930: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x 
/root/.ansible/tmp/ansible-tmp-1726882696.6426735-25955-193046000513313/ /root/.ansible/tmp/ansible-tmp-1726882696.6426735-25955-193046000513313/AnsiballZ_stat.py && sleep 0' 25201 1726882696.73317: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25201 1726882696.73700: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882696.73711: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882696.73724: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882696.73762: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882696.73773: stderr chunk (state=3): >>>debug2: match not found <<< 25201 1726882696.73784: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882696.73797: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25201 1726882696.73804: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 25201 1726882696.73810: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 25201 1726882696.73818: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882696.73827: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882696.73838: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882696.73845: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882696.73851: stderr chunk (state=3): >>>debug2: match found <<< 25201 1726882696.73860: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882696.73938: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882696.73952: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882696.73962: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882696.74114: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882696.75945: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882696.75949: stdout chunk (state=3): >>><<< 25201 1726882696.75955: stderr chunk (state=3): >>><<< 25201 1726882696.75980: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25201 1726882696.75984: _low_level_execute_command(): starting 25201 1726882696.75988: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882696.6426735-25955-193046000513313/AnsiballZ_stat.py && sleep 0' 25201 1726882696.76616: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25201 1726882696.76631: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882696.76641: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882696.76653: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882696.76696: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882696.76702: stderr chunk (state=3): >>>debug2: match not found <<< 25201 1726882696.76712: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882696.76726: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25201 1726882696.76740: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 25201 1726882696.76746: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 25201 1726882696.76754: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882696.76762: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882696.76780: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882696.76787: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882696.76794: stderr chunk (state=3): >>>debug2: match found <<< 25201 1726882696.76802: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882696.76883: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882696.76899: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882696.76910: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882696.77270: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882696.90129: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/veth0", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 30807, "dev": 21, "nlink": 1, "atime": 1726882686.4703038, "mtime": 1726882686.4703038, "ctime": 1726882686.4703038, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/veth0", "lnk_target": "../../devices/virtual/net/veth0", "pw_name": "root", "gr_name": "root"}, 
"invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/veth0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 25201 1726882696.91178: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. <<< 25201 1726882696.91182: stdout chunk (state=3): >>><<< 25201 1726882696.91188: stderr chunk (state=3): >>><<< 25201 1726882696.91213: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/veth0", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 30807, "dev": 21, "nlink": 1, "atime": 1726882686.4703038, "mtime": 1726882686.4703038, "ctime": 1726882686.4703038, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/veth0", "lnk_target": "../../devices/virtual/net/veth0", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/veth0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
25201 1726882696.91272: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/veth0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882696.6426735-25955-193046000513313/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 25201 1726882696.91281: _low_level_execute_command(): starting 25201 1726882696.91287: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882696.6426735-25955-193046000513313/ > /dev/null 2>&1 && sleep 0' 25201 1726882696.92505: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882696.92511: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882696.93207: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 25201 1726882696.93213: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882696.93229: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882696.93234: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882696.93317: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882696.93356: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882696.93451: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882696.95376: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882696.95380: stderr chunk (state=3): >>><<< 25201 1726882696.95383: stdout chunk (state=3): >>><<< 25201 1726882696.95405: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25201 1726882696.95409: handler run complete 25201 1726882696.95460: attempt loop complete, returning result 25201 1726882696.95465: _execute() done 25201 1726882696.95468: dumping result to json 25201 1726882696.95482: done dumping result, returning 25201 1726882696.95491: done running TaskExecutor() for managed_node2/TASK: Get stat for interface veth0 [0e448fcc-3ce9-313b-197e-0000000003a0] 25201 1726882696.95496: sending task result for task 0e448fcc-3ce9-313b-197e-0000000003a0 25201 1726882696.95612: done sending task result for task 0e448fcc-3ce9-313b-197e-0000000003a0 25201 1726882696.95614: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "stat": { "atime": 1726882686.4703038, "block_size": 4096, "blocks": 0, "ctime": 1726882686.4703038, "dev": 21, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 30807, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/veth0", "lnk_target": "../../devices/virtual/net/veth0", "mode": "0777", "mtime": 1726882686.4703038, "nlink": 1, "path": "/sys/class/net/veth0", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 25201 1726882696.95778: no more pending results, returning what we have 25201 1726882696.95783: results queue empty 25201 1726882696.95784: checking for any_errors_fatal 25201 1726882696.95786: done checking for any_errors_fatal 25201 1726882696.95787: checking for max_fail_percentage 25201 1726882696.95789: done checking for max_fail_percentage 25201 1726882696.95790: checking to see if all hosts have failed and the running result is not ok 25201 1726882696.95791: done checking to see if all hosts have failed 25201 1726882696.95792: getting the remaining hosts for this loop 25201 1726882696.95794: done getting the remaining hosts for this loop 25201 1726882696.95799: getting the next task for host managed_node2 25201 1726882696.95808: done getting next task for host managed_node2 25201 1726882696.95810: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 25201 1726882696.95812: ^ state is: HOST STATE: block=3, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25201 1726882696.95817: getting variables 25201 1726882696.95819: in VariableManager get_vars() 25201 1726882696.95860: Calling all_inventory to load vars for managed_node2 25201 1726882696.95867: Calling groups_inventory to load vars for managed_node2 25201 1726882696.95869: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882696.95882: Calling all_plugins_play to load vars for managed_node2 25201 1726882696.95885: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882696.95889: Calling groups_plugins_play to load vars for managed_node2 25201 1726882696.99947: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882697.01845: done with get_vars() 25201 1726882697.01871: done getting variables 25201 1726882697.01972: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) 25201 1726882697.02091: variable 'interface' from source: play vars TASK [Assert that the interface is present - 'veth0'] ************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Friday 20 September 2024 21:38:17 -0400 (0:00:00.419) 0:00:18.195 ****** 25201 1726882697.02122: entering _queue_task() for managed_node2/assert 25201 1726882697.02124: Creating lock for assert 25201 1726882697.02434: worker is 1 (out of 1 available) 25201 1726882697.02446: exiting _queue_task() for managed_node2/assert 25201 1726882697.02458: done queuing things up, now waiting for results queue to drain 25201 1726882697.02460: waiting for pending results... 
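The rendered task headers ('Get stat for interface veth0', "Assert that the interface is present - 'veth0'") come from the variable interface, which the trace reports as 'from source: play vars', i.e. defined on the play in tests_ipv6.yml rather than in inventory or facts. A shape-only sketch of that play follows; the value veth0 and the include at tests_ipv6.yml:49 are confirmed by the log, everything else (hosts pattern, relative path) is an assumption:

# tests_ipv6.yml -- shape-only reconstruction
- hosts: all                      # assumption; the log only shows managed_node2 being targeted
  vars:
    interface: veth0              # value confirmed by the rendered task names and the stat path
  tasks:
    # ...
    - name: Include the task 'assert_profile_present.yml'   # tests_ipv6.yml:49 per the log
      include_tasks: tasks/assert_profile_present.yml       # relative path is an assumption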
25201 1726882697.03261: running TaskExecutor() for managed_node2/TASK: Assert that the interface is present - 'veth0' 25201 1726882697.03412: in run() - task 0e448fcc-3ce9-313b-197e-0000000002b6 25201 1726882697.03442: variable 'ansible_search_path' from source: unknown 25201 1726882697.03450: variable 'ansible_search_path' from source: unknown 25201 1726882697.03496: calling self._execute() 25201 1726882697.03608: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882697.03624: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882697.03648: variable 'omit' from source: magic vars 25201 1726882697.04072: variable 'ansible_distribution_major_version' from source: facts 25201 1726882697.04095: Evaluated conditional (ansible_distribution_major_version != '6'): True 25201 1726882697.04107: variable 'omit' from source: magic vars 25201 1726882697.04149: variable 'omit' from source: magic vars 25201 1726882697.04255: variable 'interface' from source: play vars 25201 1726882697.04285: variable 'omit' from source: magic vars 25201 1726882697.04335: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25201 1726882697.04379: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25201 1726882697.04411: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25201 1726882697.04434: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25201 1726882697.04453: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25201 1726882697.04492: variable 'inventory_hostname' from source: host vars for 'managed_node2' 25201 1726882697.04505: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882697.04519: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882697.04634: Set connection var ansible_shell_executable to /bin/sh 25201 1726882697.04646: Set connection var ansible_pipelining to False 25201 1726882697.04655: Set connection var ansible_connection to ssh 25201 1726882697.04667: Set connection var ansible_module_compression to ZIP_DEFLATED 25201 1726882697.04676: Set connection var ansible_shell_type to sh 25201 1726882697.04688: Set connection var ansible_timeout to 10 25201 1726882697.04721: variable 'ansible_shell_executable' from source: unknown 25201 1726882697.04734: variable 'ansible_connection' from source: unknown 25201 1726882697.04742: variable 'ansible_module_compression' from source: unknown 25201 1726882697.04748: variable 'ansible_shell_type' from source: unknown 25201 1726882697.04755: variable 'ansible_shell_executable' from source: unknown 25201 1726882697.04762: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882697.04772: variable 'ansible_pipelining' from source: unknown 25201 1726882697.04779: variable 'ansible_timeout' from source: unknown 25201 1726882697.04787: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882697.04934: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, 
class_only=False) 25201 1726882697.04956: variable 'omit' from source: magic vars 25201 1726882697.04967: starting attempt loop 25201 1726882697.04974: running the handler 25201 1726882697.05116: variable 'interface_stat' from source: set_fact 25201 1726882697.05140: Evaluated conditional (interface_stat.stat.exists): True 25201 1726882697.05150: handler run complete 25201 1726882697.05179: attempt loop complete, returning result 25201 1726882697.05186: _execute() done 25201 1726882697.05192: dumping result to json 25201 1726882697.05199: done dumping result, returning 25201 1726882697.05210: done running TaskExecutor() for managed_node2/TASK: Assert that the interface is present - 'veth0' [0e448fcc-3ce9-313b-197e-0000000002b6] 25201 1726882697.05221: sending task result for task 0e448fcc-3ce9-313b-197e-0000000002b6 ok: [managed_node2] => { "changed": false } MSG: All assertions passed 25201 1726882697.05369: no more pending results, returning what we have 25201 1726882697.05372: results queue empty 25201 1726882697.05373: checking for any_errors_fatal 25201 1726882697.05382: done checking for any_errors_fatal 25201 1726882697.05382: checking for max_fail_percentage 25201 1726882697.05384: done checking for max_fail_percentage 25201 1726882697.05385: checking to see if all hosts have failed and the running result is not ok 25201 1726882697.05385: done checking to see if all hosts have failed 25201 1726882697.05386: getting the remaining hosts for this loop 25201 1726882697.05387: done getting the remaining hosts for this loop 25201 1726882697.05391: getting the next task for host managed_node2 25201 1726882697.05399: done getting next task for host managed_node2 25201 1726882697.05401: ^ task is: TASK: Include the task 'assert_profile_present.yml' 25201 1726882697.05403: ^ state is: HOST STATE: block=3, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25201 1726882697.05406: getting variables 25201 1726882697.05407: in VariableManager get_vars() 25201 1726882697.05468: Calling all_inventory to load vars for managed_node2 25201 1726882697.05471: Calling groups_inventory to load vars for managed_node2 25201 1726882697.05474: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882697.05485: Calling all_plugins_play to load vars for managed_node2 25201 1726882697.05487: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882697.05491: Calling groups_plugins_play to load vars for managed_node2 25201 1726882697.06548: done sending task result for task 0e448fcc-3ce9-313b-197e-0000000002b6 25201 1726882697.06552: WORKER PROCESS EXITING 25201 1726882697.07793: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882697.10291: done with get_vars() 25201 1726882697.10313: done getting variables TASK [Include the task 'assert_profile_present.yml'] *************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:49 Friday 20 September 2024 21:38:17 -0400 (0:00:00.082) 0:00:18.278 ****** 25201 1726882697.10402: entering _queue_task() for managed_node2/include_tasks 25201 1726882697.11004: worker is 1 (out of 1 available) 25201 1726882697.11018: exiting _queue_task() for managed_node2/include_tasks 25201 1726882697.11030: done queuing things up, now waiting for results queue to drain 25201 1726882697.11032: waiting for pending results... 25201 1726882697.11317: running TaskExecutor() for managed_node2/TASK: Include the task 'assert_profile_present.yml' 25201 1726882697.11403: in run() - task 0e448fcc-3ce9-313b-197e-00000000005d 25201 1726882697.11414: variable 'ansible_search_path' from source: unknown 25201 1726882697.11448: calling self._execute() 25201 1726882697.11539: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882697.11544: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882697.11555: variable 'omit' from source: magic vars 25201 1726882697.11922: variable 'ansible_distribution_major_version' from source: facts 25201 1726882697.11934: Evaluated conditional (ansible_distribution_major_version != '6'): True 25201 1726882697.11940: _execute() done 25201 1726882697.11943: dumping result to json 25201 1726882697.11946: done dumping result, returning 25201 1726882697.11953: done running TaskExecutor() for managed_node2/TASK: Include the task 'assert_profile_present.yml' [0e448fcc-3ce9-313b-197e-00000000005d] 25201 1726882697.11958: sending task result for task 0e448fcc-3ce9-313b-197e-00000000005d 25201 1726882697.12051: done sending task result for task 0e448fcc-3ce9-313b-197e-00000000005d 25201 1726882697.12054: WORKER PROCESS EXITING 25201 1726882697.12080: no more pending results, returning what we have 25201 1726882697.12085: in VariableManager get_vars() 25201 1726882697.12127: Calling all_inventory to load vars for managed_node2 25201 1726882697.12130: Calling groups_inventory to load vars for managed_node2 25201 1726882697.12132: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882697.12142: Calling all_plugins_play to load vars for managed_node2 25201 1726882697.12145: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882697.12147: Calling groups_plugins_play to load vars for managed_node2 25201 1726882697.13550: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882697.15886: done with get_vars() 25201 1726882697.15907: variable 'ansible_search_path' from source: unknown 25201 1726882697.15922: we have included files to process 25201 1726882697.15923: generating all_blocks data 25201 1726882697.15925: done generating all_blocks data 25201 1726882697.15931: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 25201 1726882697.15932: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 25201 1726882697.15935: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 25201 1726882697.16399: in VariableManager get_vars() 25201 1726882697.16616: done with get_vars() 25201 1726882697.16927: done processing included file 25201 1726882697.16928: iterating over new_blocks loaded from include file 25201 1726882697.16930: in VariableManager get_vars() 25201 1726882697.16951: done with get_vars() 25201 1726882697.16957: filtering new block on tags 25201 1726882697.17002: done filtering new block on tags 25201 1726882697.17005: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed_node2 25201 1726882697.17010: extending task lists for all hosts with included blocks 25201 1726882697.19754: done extending task lists 25201 1726882697.19756: done processing included files 25201 1726882697.19757: results queue empty 25201 1726882697.19757: checking for any_errors_fatal 25201 1726882697.19760: done checking for any_errors_fatal 25201 1726882697.19761: checking for max_fail_percentage 25201 1726882697.19762: done checking for max_fail_percentage 25201 1726882697.19766: checking to see if all hosts have failed and the running result is not ok 25201 1726882697.19767: done checking to see if all hosts have failed 25201 1726882697.19767: getting the remaining hosts for this loop 25201 1726882697.19769: done getting the remaining hosts for this loop 25201 1726882697.19771: getting the next task for host managed_node2 25201 1726882697.19774: done getting next task for host managed_node2 25201 1726882697.19776: ^ task is: TASK: Include the task 'get_profile_stat.yml' 25201 1726882697.19778: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25201 1726882697.19780: getting variables 25201 1726882697.19781: in VariableManager get_vars() 25201 1726882697.19908: Calling all_inventory to load vars for managed_node2 25201 1726882697.19910: Calling groups_inventory to load vars for managed_node2 25201 1726882697.19912: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882697.19918: Calling all_plugins_play to load vars for managed_node2 25201 1726882697.19920: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882697.19923: Calling groups_plugins_play to load vars for managed_node2 25201 1726882697.21993: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882697.23749: done with get_vars() 25201 1726882697.23774: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3 Friday 20 September 2024 21:38:17 -0400 (0:00:00.134) 0:00:18.413 ****** 25201 1726882697.23856: entering _queue_task() for managed_node2/include_tasks 25201 1726882697.24195: worker is 1 (out of 1 available) 25201 1726882697.24207: exiting _queue_task() for managed_node2/include_tasks 25201 1726882697.24219: done queuing things up, now waiting for results queue to drain 25201 1726882697.24221: waiting for pending results... 25201 1726882697.24520: running TaskExecutor() for managed_node2/TASK: Include the task 'get_profile_stat.yml' 25201 1726882697.24630: in run() - task 0e448fcc-3ce9-313b-197e-0000000003b8 25201 1726882697.24650: variable 'ansible_search_path' from source: unknown 25201 1726882697.24660: variable 'ansible_search_path' from source: unknown 25201 1726882697.24709: calling self._execute() 25201 1726882697.24806: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882697.24819: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882697.24834: variable 'omit' from source: magic vars 25201 1726882697.25222: variable 'ansible_distribution_major_version' from source: facts 25201 1726882697.25238: Evaluated conditional (ansible_distribution_major_version != '6'): True 25201 1726882697.25247: _execute() done 25201 1726882697.25253: dumping result to json 25201 1726882697.25259: done dumping result, returning 25201 1726882697.25272: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_profile_stat.yml' [0e448fcc-3ce9-313b-197e-0000000003b8] 25201 1726882697.25280: sending task result for task 0e448fcc-3ce9-313b-197e-0000000003b8 25201 1726882697.25394: no more pending results, returning what we have 25201 1726882697.25399: in VariableManager get_vars() 25201 1726882697.25443: Calling all_inventory to load vars for managed_node2 25201 1726882697.25446: Calling groups_inventory to load vars for managed_node2 25201 1726882697.25449: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882697.25467: Calling all_plugins_play to load vars for managed_node2 25201 1726882697.25471: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882697.25475: Calling groups_plugins_play to load vars for managed_node2 25201 1726882697.26544: done sending task result for task 0e448fcc-3ce9-313b-197e-0000000003b8 25201 1726882697.26547: WORKER PROCESS EXITING 25201 1726882697.27340: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped 
due to reserved name 25201 1726882697.29103: done with get_vars() 25201 1726882697.29121: variable 'ansible_search_path' from source: unknown 25201 1726882697.29127: variable 'ansible_search_path' from source: unknown 25201 1726882697.29161: we have included files to process 25201 1726882697.29162: generating all_blocks data 25201 1726882697.29167: done generating all_blocks data 25201 1726882697.29169: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 25201 1726882697.29170: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 25201 1726882697.29172: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 25201 1726882697.30227: done processing included file 25201 1726882697.30229: iterating over new_blocks loaded from include file 25201 1726882697.30230: in VariableManager get_vars() 25201 1726882697.30248: done with get_vars() 25201 1726882697.30250: filtering new block on tags 25201 1726882697.30277: done filtering new block on tags 25201 1726882697.30280: in VariableManager get_vars() 25201 1726882697.30297: done with get_vars() 25201 1726882697.30299: filtering new block on tags 25201 1726882697.30324: done filtering new block on tags 25201 1726882697.30327: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node2 25201 1726882697.30331: extending task lists for all hosts with included blocks 25201 1726882697.30503: done extending task lists 25201 1726882697.30504: done processing included files 25201 1726882697.30505: results queue empty 25201 1726882697.30506: checking for any_errors_fatal 25201 1726882697.30509: done checking for any_errors_fatal 25201 1726882697.30509: checking for max_fail_percentage 25201 1726882697.30510: done checking for max_fail_percentage 25201 1726882697.30511: checking to see if all hosts have failed and the running result is not ok 25201 1726882697.30512: done checking to see if all hosts have failed 25201 1726882697.30513: getting the remaining hosts for this loop 25201 1726882697.30514: done getting the remaining hosts for this loop 25201 1726882697.30516: getting the next task for host managed_node2 25201 1726882697.30520: done getting next task for host managed_node2 25201 1726882697.30523: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 25201 1726882697.30525: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25201 1726882697.30528: getting variables 25201 1726882697.30528: in VariableManager get_vars() 25201 1726882697.30587: Calling all_inventory to load vars for managed_node2 25201 1726882697.30589: Calling groups_inventory to load vars for managed_node2 25201 1726882697.30591: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882697.30596: Calling all_plugins_play to load vars for managed_node2 25201 1726882697.30599: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882697.30601: Calling groups_plugins_play to load vars for managed_node2 25201 1726882697.31809: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882697.33540: done with get_vars() 25201 1726882697.33560: done getting variables 25201 1726882697.33603: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Friday 20 September 2024 21:38:17 -0400 (0:00:00.097) 0:00:18.510 ****** 25201 1726882697.33630: entering _queue_task() for managed_node2/set_fact 25201 1726882697.33929: worker is 1 (out of 1 available) 25201 1726882697.33940: exiting _queue_task() for managed_node2/set_fact 25201 1726882697.33952: done queuing things up, now waiting for results queue to drain 25201 1726882697.33953: waiting for pending results... 
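The task queued above, 'Initialize NM profile exist and ansible_managed comment flag' (get_profile_stat.yml:3), is a plain set_fact; judging from the task name and the three facts reported in its ok result just below, it presumably amounts to something close to the following sketch (a reconstruction, not the verbatim contents of the task file):

    - name: Initialize NM profile exist and ansible_managed comment flag
      set_fact:
        lsr_net_profile_exists: false
        lsr_net_profile_ansible_managed: false
        lsr_net_profile_fingerprint: false

Initializing all three flags to false up front lets the later assertion steps read them unconditionally, even when the lookups that would flip them to true end up skipped.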
25201 1726882697.34230: running TaskExecutor() for managed_node2/TASK: Initialize NM profile exist and ansible_managed comment flag 25201 1726882697.34356: in run() - task 0e448fcc-3ce9-313b-197e-0000000004b0 25201 1726882697.34383: variable 'ansible_search_path' from source: unknown 25201 1726882697.34393: variable 'ansible_search_path' from source: unknown 25201 1726882697.34433: calling self._execute() 25201 1726882697.34529: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882697.34541: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882697.34556: variable 'omit' from source: magic vars 25201 1726882697.34948: variable 'ansible_distribution_major_version' from source: facts 25201 1726882697.34971: Evaluated conditional (ansible_distribution_major_version != '6'): True 25201 1726882697.34983: variable 'omit' from source: magic vars 25201 1726882697.35035: variable 'omit' from source: magic vars 25201 1726882697.35084: variable 'omit' from source: magic vars 25201 1726882697.35131: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25201 1726882697.35180: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25201 1726882697.35207: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25201 1726882697.35235: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25201 1726882697.35253: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25201 1726882697.35294: variable 'inventory_hostname' from source: host vars for 'managed_node2' 25201 1726882697.35303: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882697.35311: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882697.35427: Set connection var ansible_shell_executable to /bin/sh 25201 1726882697.35438: Set connection var ansible_pipelining to False 25201 1726882697.35453: Set connection var ansible_connection to ssh 25201 1726882697.35467: Set connection var ansible_module_compression to ZIP_DEFLATED 25201 1726882697.35477: Set connection var ansible_shell_type to sh 25201 1726882697.35494: Set connection var ansible_timeout to 10 25201 1726882697.35520: variable 'ansible_shell_executable' from source: unknown 25201 1726882697.35529: variable 'ansible_connection' from source: unknown 25201 1726882697.35537: variable 'ansible_module_compression' from source: unknown 25201 1726882697.35543: variable 'ansible_shell_type' from source: unknown 25201 1726882697.35550: variable 'ansible_shell_executable' from source: unknown 25201 1726882697.35567: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882697.35577: variable 'ansible_pipelining' from source: unknown 25201 1726882697.35584: variable 'ansible_timeout' from source: unknown 25201 1726882697.35593: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882697.35738: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 25201 1726882697.35755: variable 
'omit' from source: magic vars 25201 1726882697.35768: starting attempt loop 25201 1726882697.35777: running the handler 25201 1726882697.35798: handler run complete 25201 1726882697.35816: attempt loop complete, returning result 25201 1726882697.35823: _execute() done 25201 1726882697.35830: dumping result to json 25201 1726882697.35837: done dumping result, returning 25201 1726882697.35848: done running TaskExecutor() for managed_node2/TASK: Initialize NM profile exist and ansible_managed comment flag [0e448fcc-3ce9-313b-197e-0000000004b0] 25201 1726882697.35858: sending task result for task 0e448fcc-3ce9-313b-197e-0000000004b0 ok: [managed_node2] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 25201 1726882697.36008: no more pending results, returning what we have 25201 1726882697.36012: results queue empty 25201 1726882697.36013: checking for any_errors_fatal 25201 1726882697.36015: done checking for any_errors_fatal 25201 1726882697.36015: checking for max_fail_percentage 25201 1726882697.36017: done checking for max_fail_percentage 25201 1726882697.36018: checking to see if all hosts have failed and the running result is not ok 25201 1726882697.36019: done checking to see if all hosts have failed 25201 1726882697.36020: getting the remaining hosts for this loop 25201 1726882697.36022: done getting the remaining hosts for this loop 25201 1726882697.36025: getting the next task for host managed_node2 25201 1726882697.36033: done getting next task for host managed_node2 25201 1726882697.36036: ^ task is: TASK: Stat profile file 25201 1726882697.36040: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25201 1726882697.36045: getting variables 25201 1726882697.36047: in VariableManager get_vars() 25201 1726882697.36093: Calling all_inventory to load vars for managed_node2 25201 1726882697.36100: Calling groups_inventory to load vars for managed_node2 25201 1726882697.36103: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882697.36118: Calling all_plugins_play to load vars for managed_node2 25201 1726882697.36123: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882697.36127: Calling groups_plugins_play to load vars for managed_node2 25201 1726882697.37136: done sending task result for task 0e448fcc-3ce9-313b-197e-0000000004b0 25201 1726882697.37139: WORKER PROCESS EXITING 25201 1726882697.38247: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882697.41042: done with get_vars() 25201 1726882697.41073: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Friday 20 September 2024 21:38:17 -0400 (0:00:00.075) 0:00:18.586 ****** 25201 1726882697.41155: entering _queue_task() for managed_node2/stat 25201 1726882697.41455: worker is 1 (out of 1 available) 25201 1726882697.41473: exiting _queue_task() for managed_node2/stat 25201 1726882697.41491: done queuing things up, now waiting for results queue to drain 25201 1726882697.41493: waiting for pending results... 25201 1726882697.41796: running TaskExecutor() for managed_node2/TASK: Stat profile file 25201 1726882697.41931: in run() - task 0e448fcc-3ce9-313b-197e-0000000004b1 25201 1726882697.41952: variable 'ansible_search_path' from source: unknown 25201 1726882697.41960: variable 'ansible_search_path' from source: unknown 25201 1726882697.42004: calling self._execute() 25201 1726882697.43451: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882697.43473: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882697.43491: variable 'omit' from source: magic vars 25201 1726882697.44243: variable 'ansible_distribution_major_version' from source: facts 25201 1726882697.44261: Evaluated conditional (ansible_distribution_major_version != '6'): True 25201 1726882697.44278: variable 'omit' from source: magic vars 25201 1726882697.44421: variable 'omit' from source: magic vars 25201 1726882697.44631: variable 'profile' from source: include params 25201 1726882697.44642: variable 'interface' from source: play vars 25201 1726882697.44725: variable 'interface' from source: play vars 25201 1726882697.44835: variable 'omit' from source: magic vars 25201 1726882697.44884: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25201 1726882697.45035: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25201 1726882697.45062: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25201 1726882697.45090: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25201 1726882697.45107: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25201 1726882697.45257: variable 'inventory_hostname' from source: host vars for 
'managed_node2' 25201 1726882697.45273: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882697.45282: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882697.45400: Set connection var ansible_shell_executable to /bin/sh 25201 1726882697.45480: Set connection var ansible_pipelining to False 25201 1726882697.45491: Set connection var ansible_connection to ssh 25201 1726882697.45501: Set connection var ansible_module_compression to ZIP_DEFLATED 25201 1726882697.45507: Set connection var ansible_shell_type to sh 25201 1726882697.45518: Set connection var ansible_timeout to 10 25201 1726882697.45606: variable 'ansible_shell_executable' from source: unknown 25201 1726882697.45614: variable 'ansible_connection' from source: unknown 25201 1726882697.45621: variable 'ansible_module_compression' from source: unknown 25201 1726882697.45628: variable 'ansible_shell_type' from source: unknown 25201 1726882697.45635: variable 'ansible_shell_executable' from source: unknown 25201 1726882697.45642: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882697.45650: variable 'ansible_pipelining' from source: unknown 25201 1726882697.45657: variable 'ansible_timeout' from source: unknown 25201 1726882697.45669: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882697.45999: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 25201 1726882697.46138: variable 'omit' from source: magic vars 25201 1726882697.46147: starting attempt loop 25201 1726882697.46154: running the handler 25201 1726882697.46177: _low_level_execute_command(): starting 25201 1726882697.46246: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 25201 1726882697.48598: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25201 1726882697.48612: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882697.48630: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882697.48649: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882697.48700: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882697.48711: stderr chunk (state=3): >>>debug2: match not found <<< 25201 1726882697.48724: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882697.48745: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25201 1726882697.48756: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 25201 1726882697.48771: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 25201 1726882697.48786: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882697.48798: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882697.48812: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882697.48823: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 
10.31.11.158 <<< 25201 1726882697.48833: stderr chunk (state=3): >>>debug2: match found <<< 25201 1726882697.48846: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882697.48930: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882697.48950: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882697.48973: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882697.49201: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882697.50843: stdout chunk (state=3): >>>/root <<< 25201 1726882697.51033: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882697.51036: stdout chunk (state=3): >>><<< 25201 1726882697.51039: stderr chunk (state=3): >>><<< 25201 1726882697.51150: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25201 1726882697.51155: _low_level_execute_command(): starting 25201 1726882697.51157: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882697.5105765-25991-76099759223827 `" && echo ansible-tmp-1726882697.5105765-25991-76099759223827="` echo /root/.ansible/tmp/ansible-tmp-1726882697.5105765-25991-76099759223827 `" ) && sleep 0' 25201 1726882697.52512: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882697.52515: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882697.52628: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882697.52669: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 25201 1726882697.52675: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882697.52685: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882697.52829: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882697.52957: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882697.52960: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882697.53079: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882697.54941: stdout chunk (state=3): >>>ansible-tmp-1726882697.5105765-25991-76099759223827=/root/.ansible/tmp/ansible-tmp-1726882697.5105765-25991-76099759223827 <<< 25201 1726882697.55054: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882697.55124: stderr chunk (state=3): >>><<< 25201 1726882697.55128: stdout chunk (state=3): >>><<< 25201 1726882697.55456: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882697.5105765-25991-76099759223827=/root/.ansible/tmp/ansible-tmp-1726882697.5105765-25991-76099759223827 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25201 1726882697.55460: variable 'ansible_module_compression' from source: unknown 25201 1726882697.55466: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-25201fmfeipqk/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 25201 1726882697.55470: variable 'ansible_facts' from source: unknown 25201 1726882697.55473: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882697.5105765-25991-76099759223827/AnsiballZ_stat.py 25201 1726882697.55913: Sending initial data 25201 1726882697.55916: Sent initial data (152 bytes) 25201 1726882697.58326: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25201 1726882697.58340: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882697.58356: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882697.58380: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882697.58426: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882697.58439: stderr chunk (state=3): 
>>>debug2: match not found <<< 25201 1726882697.58453: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882697.58514: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25201 1726882697.58526: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 25201 1726882697.58536: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 25201 1726882697.58548: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882697.58560: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882697.58582: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882697.58594: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882697.58607: stderr chunk (state=3): >>>debug2: match found <<< 25201 1726882697.58620: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882697.58700: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882697.58838: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882697.58854: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882697.59055: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882697.60971: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 25201 1726882697.60976: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 25201 1726882697.60999: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-25201fmfeipqk/tmp8j48yco4 /root/.ansible/tmp/ansible-tmp-1726882697.5105765-25991-76099759223827/AnsiballZ_stat.py <<< 25201 1726882697.61093: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 25201 1726882697.62574: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882697.62646: stderr chunk (state=3): >>><<< 25201 1726882697.62650: stdout chunk (state=3): >>><<< 25201 1726882697.62675: done transferring module to remote 25201 1726882697.62686: _low_level_execute_command(): starting 25201 1726882697.62690: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882697.5105765-25991-76099759223827/ /root/.ansible/tmp/ansible-tmp-1726882697.5105765-25991-76099759223827/AnsiballZ_stat.py && sleep 0' 25201 1726882697.64324: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25201 1726882697.64446: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882697.64462: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882697.64490: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882697.64535: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882697.64552: stderr chunk (state=3): >>>debug2: match not found <<< 25201 1726882697.64572: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882697.64592: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25201 1726882697.64605: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 25201 1726882697.64616: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 25201 1726882697.64629: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882697.64644: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882697.64668: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882697.64683: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882697.64695: stderr chunk (state=3): >>>debug2: match found <<< 25201 1726882697.64709: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882697.64903: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882697.64920: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882697.64935: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882697.65103: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882697.66952: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882697.66955: stdout chunk (state=3): >>><<< 25201 1726882697.66958: stderr chunk (state=3): >>><<< 25201 1726882697.67053: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25201 1726882697.67056: _low_level_execute_command(): starting 25201 1726882697.67058: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 
/root/.ansible/tmp/ansible-tmp-1726882697.5105765-25991-76099759223827/AnsiballZ_stat.py && sleep 0' 25201 1726882697.68498: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25201 1726882697.68617: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882697.68633: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882697.68652: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882697.68699: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882697.68715: stderr chunk (state=3): >>>debug2: match not found <<< 25201 1726882697.68730: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882697.68748: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25201 1726882697.68760: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 25201 1726882697.68777: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 25201 1726882697.68791: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882697.68805: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882697.68824: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882697.68837: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882697.68848: stderr chunk (state=3): >>>debug2: match found <<< 25201 1726882697.68866: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882697.69008: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882697.69053: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882697.69074: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882697.69380: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882697.82357: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-veth0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 25201 1726882697.83410: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
<<< 25201 1726882697.83461: stderr chunk (state=3): >>><<< 25201 1726882697.83468: stdout chunk (state=3): >>><<< 25201 1726882697.83723: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-veth0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 25201 1726882697.83728: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-veth0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882697.5105765-25991-76099759223827/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 25201 1726882697.83735: _low_level_execute_command(): starting 25201 1726882697.83738: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882697.5105765-25991-76099759223827/ > /dev/null 2>&1 && sleep 0' 25201 1726882697.84856: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25201 1726882697.84875: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882697.84889: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882697.84906: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882697.84946: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882697.84957: stderr chunk (state=3): >>>debug2: match not found <<< 25201 1726882697.84974: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882697.84992: stderr chunk 
(state=3): >>>debug1: configuration requests final Match pass <<< 25201 1726882697.85002: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 25201 1726882697.85012: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 25201 1726882697.85023: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882697.85034: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882697.85048: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882697.85058: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882697.85078: stderr chunk (state=3): >>>debug2: match found <<< 25201 1726882697.85092: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882697.85171: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882697.85279: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882697.85293: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882697.85479: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882697.87370: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882697.87374: stdout chunk (state=3): >>><<< 25201 1726882697.87376: stderr chunk (state=3): >>><<< 25201 1726882697.87580: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25201 1726882697.87584: handler run complete 25201 1726882697.87586: attempt loop complete, returning result 25201 1726882697.87588: _execute() done 25201 1726882697.87590: dumping result to json 25201 1726882697.87592: done dumping result, returning 25201 1726882697.87594: done running TaskExecutor() for managed_node2/TASK: Stat profile file [0e448fcc-3ce9-313b-197e-0000000004b1] 25201 1726882697.87596: sending task result for task 0e448fcc-3ce9-313b-197e-0000000004b1 25201 1726882697.87713: done sending task result for task 0e448fcc-3ce9-313b-197e-0000000004b1 25201 1726882697.87716: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "stat": { "exists": false } } 25201 1726882697.87780: no more pending results, returning what we have 
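The 'Stat profile file' step that just returned ok is a straightforward stat call: the module invocation logged above shows the exact arguments (path /etc/sysconfig/network-scripts/ifcfg-veth0 with attribute, checksum and mime collection disabled), and the profile_stat.stat.exists condition evaluated by the next task implies the result is registered under that name. A rough reconstruction, where the {{ profile }} templating and the register name are inferred rather than quoted from the file:

    - name: Stat profile file
      stat:
        path: "/etc/sysconfig/network-scripts/ifcfg-{{ profile }}"  # rendered as ifcfg-veth0 in this run
        get_attributes: false
        get_checksum: false
        get_mime: false
      register: profile_stat

Because the file does not exist on managed_node2, stat returns exists: false and the lsr_net_profile_exists flag initialized earlier is left untouched.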
25201 1726882697.87784: results queue empty 25201 1726882697.87785: checking for any_errors_fatal 25201 1726882697.87792: done checking for any_errors_fatal 25201 1726882697.87793: checking for max_fail_percentage 25201 1726882697.87795: done checking for max_fail_percentage 25201 1726882697.87796: checking to see if all hosts have failed and the running result is not ok 25201 1726882697.87797: done checking to see if all hosts have failed 25201 1726882697.87798: getting the remaining hosts for this loop 25201 1726882697.87799: done getting the remaining hosts for this loop 25201 1726882697.87803: getting the next task for host managed_node2 25201 1726882697.87812: done getting next task for host managed_node2 25201 1726882697.87815: ^ task is: TASK: Set NM profile exist flag based on the profile files 25201 1726882697.87819: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25201 1726882697.87824: getting variables 25201 1726882697.87826: in VariableManager get_vars() 25201 1726882697.87876: Calling all_inventory to load vars for managed_node2 25201 1726882697.87880: Calling groups_inventory to load vars for managed_node2 25201 1726882697.87882: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882697.87894: Calling all_plugins_play to load vars for managed_node2 25201 1726882697.87897: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882697.87900: Calling groups_plugins_play to load vars for managed_node2 25201 1726882697.89805: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882697.92345: done with get_vars() 25201 1726882697.92367: done getting variables 25201 1726882697.92426: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Friday 20 September 2024 21:38:17 -0400 (0:00:00.512) 0:00:19.099 ****** 25201 1726882697.92456: entering _queue_task() for managed_node2/set_fact 25201 1726882697.92720: worker is 1 (out of 1 available) 25201 1726882697.92731: exiting _queue_task() for managed_node2/set_fact 25201 1726882697.92742: done queuing things up, now waiting for results queue to drain 25201 1726882697.92743: waiting for pending results... 
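The task being queued here, 'Set NM profile exist flag based on the profile files' (get_profile_stat.yml:17), ends up skipped in the next record because profile_stat.stat.exists is false. Given the task name, the flag initialized earlier, and the logged false_condition, it is presumably a guarded set_fact along these lines (the value assigned is an assumption inferred from the task name):

    - name: Set NM profile exist flag based on the profile files
      set_fact:
        lsr_net_profile_exists: true  # assumed: flips the flag that was initialized to false above
      when: profile_stat.stat.exists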
25201 1726882697.93003: running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag based on the profile files 25201 1726882697.93102: in run() - task 0e448fcc-3ce9-313b-197e-0000000004b2 25201 1726882697.93114: variable 'ansible_search_path' from source: unknown 25201 1726882697.93118: variable 'ansible_search_path' from source: unknown 25201 1726882697.93149: calling self._execute() 25201 1726882697.93233: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882697.93237: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882697.93247: variable 'omit' from source: magic vars 25201 1726882697.93601: variable 'ansible_distribution_major_version' from source: facts 25201 1726882697.93617: Evaluated conditional (ansible_distribution_major_version != '6'): True 25201 1726882697.93742: variable 'profile_stat' from source: set_fact 25201 1726882697.93753: Evaluated conditional (profile_stat.stat.exists): False 25201 1726882697.93756: when evaluation is False, skipping this task 25201 1726882697.93759: _execute() done 25201 1726882697.93762: dumping result to json 25201 1726882697.93768: done dumping result, returning 25201 1726882697.93771: done running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag based on the profile files [0e448fcc-3ce9-313b-197e-0000000004b2] 25201 1726882697.93777: sending task result for task 0e448fcc-3ce9-313b-197e-0000000004b2 25201 1726882697.93922: done sending task result for task 0e448fcc-3ce9-313b-197e-0000000004b2 25201 1726882697.93925: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 25201 1726882697.93979: no more pending results, returning what we have 25201 1726882697.93983: results queue empty 25201 1726882697.93985: checking for any_errors_fatal 25201 1726882697.93994: done checking for any_errors_fatal 25201 1726882697.93995: checking for max_fail_percentage 25201 1726882697.93998: done checking for max_fail_percentage 25201 1726882697.93998: checking to see if all hosts have failed and the running result is not ok 25201 1726882697.93999: done checking to see if all hosts have failed 25201 1726882697.94000: getting the remaining hosts for this loop 25201 1726882697.94002: done getting the remaining hosts for this loop 25201 1726882697.94006: getting the next task for host managed_node2 25201 1726882697.94013: done getting next task for host managed_node2 25201 1726882697.94016: ^ task is: TASK: Get NM profile info 25201 1726882697.94020: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25201 1726882697.94025: getting variables 25201 1726882697.94027: in VariableManager get_vars() 25201 1726882697.94068: Calling all_inventory to load vars for managed_node2 25201 1726882697.94072: Calling groups_inventory to load vars for managed_node2 25201 1726882697.94074: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882697.94087: Calling all_plugins_play to load vars for managed_node2 25201 1726882697.94090: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882697.94093: Calling groups_plugins_play to load vars for managed_node2 25201 1726882697.95728: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882697.98572: done with get_vars() 25201 1726882697.98603: done getting variables 25201 1726882697.98658: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Friday 20 September 2024 21:38:17 -0400 (0:00:00.062) 0:00:19.161 ****** 25201 1726882697.98717: entering _queue_task() for managed_node2/shell 25201 1726882697.99014: worker is 1 (out of 1 available) 25201 1726882697.99030: exiting _queue_task() for managed_node2/shell 25201 1726882697.99042: done queuing things up, now waiting for results queue to drain 25201 1726882697.99044: waiting for pending results... 25201 1726882697.99322: running TaskExecutor() for managed_node2/TASK: Get NM profile info 25201 1726882697.99419: in run() - task 0e448fcc-3ce9-313b-197e-0000000004b3 25201 1726882697.99431: variable 'ansible_search_path' from source: unknown 25201 1726882697.99434: variable 'ansible_search_path' from source: unknown 25201 1726882697.99475: calling self._execute() 25201 1726882697.99565: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882697.99571: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882697.99583: variable 'omit' from source: magic vars 25201 1726882697.99981: variable 'ansible_distribution_major_version' from source: facts 25201 1726882697.99994: Evaluated conditional (ansible_distribution_major_version != '6'): True 25201 1726882698.00002: variable 'omit' from source: magic vars 25201 1726882698.00060: variable 'omit' from source: magic vars 25201 1726882698.00173: variable 'profile' from source: include params 25201 1726882698.00176: variable 'interface' from source: play vars 25201 1726882698.00255: variable 'interface' from source: play vars 25201 1726882698.00275: variable 'omit' from source: magic vars 25201 1726882698.00316: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25201 1726882698.00361: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25201 1726882698.00382: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25201 1726882698.00700: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25201 1726882698.00712: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25201 1726882698.00742: variable 'inventory_hostname' from source: host vars for 'managed_node2' 25201 1726882698.00745: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882698.00747: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882698.00856: Set connection var ansible_shell_executable to /bin/sh 25201 1726882698.00862: Set connection var ansible_pipelining to False 25201 1726882698.00869: Set connection var ansible_connection to ssh 25201 1726882698.00875: Set connection var ansible_module_compression to ZIP_DEFLATED 25201 1726882698.00879: Set connection var ansible_shell_type to sh 25201 1726882698.00892: Set connection var ansible_timeout to 10 25201 1726882698.00913: variable 'ansible_shell_executable' from source: unknown 25201 1726882698.00917: variable 'ansible_connection' from source: unknown 25201 1726882698.00920: variable 'ansible_module_compression' from source: unknown 25201 1726882698.00922: variable 'ansible_shell_type' from source: unknown 25201 1726882698.00924: variable 'ansible_shell_executable' from source: unknown 25201 1726882698.00926: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882698.00930: variable 'ansible_pipelining' from source: unknown 25201 1726882698.00933: variable 'ansible_timeout' from source: unknown 25201 1726882698.00937: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882698.01301: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 25201 1726882698.01313: variable 'omit' from source: magic vars 25201 1726882698.01317: starting attempt loop 25201 1726882698.01548: running the handler 25201 1726882698.01557: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 25201 1726882698.01646: _low_level_execute_command(): starting 25201 1726882698.01660: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 25201 1726882698.03030: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25201 1726882698.03042: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882698.03053: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882698.03074: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882698.03106: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882698.03117: stderr chunk (state=3): >>>debug2: match not found <<< 25201 1726882698.03128: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882698.03145: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25201 1726882698.03154: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 
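The 'Set connection var ...' entries above (ansible_connection=ssh, ansible_shell_type=sh, ansible_shell_executable=/bin/sh, ansible_pipelining=False, ansible_module_compression=ZIP_DEFLATED, ansible_timeout=10) are per-host connection variables that get resolved before each module run in this log. If the same values were to be pinned explicitly, they could live in inventory variables roughly like this (an illustrative placement, not taken from the test inventory):

    # host_vars/managed_node2.yml -- hypothetical file, values copied from the log
    ansible_connection: ssh
    ansible_shell_type: sh
    ansible_shell_executable: /bin/sh
    ansible_pipelining: false
    ansible_module_compression: ZIP_DEFLATED
    ansible_timeout: 10

With pipelining disabled, as here, every module execution goes through the mkdir / sftp put / chmod / python / rm round-trip visible throughout this log; enabling ansible_pipelining would skip the temporary-file transfer for modules that support it.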
25201 1726882698.03160: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 25201 1726882698.03170: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882698.03180: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882698.03194: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882698.03199: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882698.03206: stderr chunk (state=3): >>>debug2: match found <<< 25201 1726882698.03217: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882698.03295: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882698.03312: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882698.03320: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882698.03455: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882698.05120: stdout chunk (state=3): >>>/root <<< 25201 1726882698.05271: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882698.05278: stdout chunk (state=3): >>><<< 25201 1726882698.05285: stderr chunk (state=3): >>><<< 25201 1726882698.05310: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25201 1726882698.05322: _low_level_execute_command(): starting 25201 1726882698.05328: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882698.0530896-26014-191367721630125 `" && echo ansible-tmp-1726882698.0530896-26014-191367721630125="` echo /root/.ansible/tmp/ansible-tmp-1726882698.0530896-26014-191367721630125 `" ) && sleep 0' 25201 1726882698.05906: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25201 1726882698.05915: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882698.05926: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882698.05941: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
<<< 25201 1726882698.05985: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882698.05993: stderr chunk (state=3): >>>debug2: match not found <<< 25201 1726882698.06003: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882698.06017: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25201 1726882698.06024: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 25201 1726882698.06030: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 25201 1726882698.06037: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882698.06046: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882698.06068: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882698.06074: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882698.06082: stderr chunk (state=3): >>>debug2: match found <<< 25201 1726882698.06092: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882698.06158: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882698.06181: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882698.06191: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882698.06318: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882698.08198: stdout chunk (state=3): >>>ansible-tmp-1726882698.0530896-26014-191367721630125=/root/.ansible/tmp/ansible-tmp-1726882698.0530896-26014-191367721630125 <<< 25201 1726882698.08381: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882698.08385: stdout chunk (state=3): >>><<< 25201 1726882698.08391: stderr chunk (state=3): >>><<< 25201 1726882698.08410: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882698.0530896-26014-191367721630125=/root/.ansible/tmp/ansible-tmp-1726882698.0530896-26014-191367721630125 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25201 1726882698.08443: variable 'ansible_module_compression' 
from source: unknown 25201 1726882698.08497: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-25201fmfeipqk/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 25201 1726882698.08532: variable 'ansible_facts' from source: unknown 25201 1726882698.08606: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882698.0530896-26014-191367721630125/AnsiballZ_command.py 25201 1726882698.08743: Sending initial data 25201 1726882698.08746: Sent initial data (156 bytes) 25201 1726882698.09870: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25201 1726882698.09874: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882698.09876: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882698.09879: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882698.09881: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882698.09883: stderr chunk (state=3): >>>debug2: match not found <<< 25201 1726882698.09885: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882698.09887: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25201 1726882698.09888: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 25201 1726882698.09890: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 25201 1726882698.09892: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882698.09894: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882698.09896: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882698.09898: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882698.09915: stderr chunk (state=3): >>>debug2: match found <<< 25201 1726882698.09922: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882698.09925: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882698.09927: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882698.09929: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882698.10484: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882698.12222: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 25201 1726882698.12320: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 25201 1726882698.12417: stdout chunk (state=3): >>>sftp> put 
/root/.ansible/tmp/ansible-local-25201fmfeipqk/tmp81gvtfw1 /root/.ansible/tmp/ansible-tmp-1726882698.0530896-26014-191367721630125/AnsiballZ_command.py <<< 25201 1726882698.12513: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 25201 1726882698.13852: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882698.13941: stderr chunk (state=3): >>><<< 25201 1726882698.13945: stdout chunk (state=3): >>><<< 25201 1726882698.13973: done transferring module to remote 25201 1726882698.13986: _low_level_execute_command(): starting 25201 1726882698.13991: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882698.0530896-26014-191367721630125/ /root/.ansible/tmp/ansible-tmp-1726882698.0530896-26014-191367721630125/AnsiballZ_command.py && sleep 0' 25201 1726882698.14715: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25201 1726882698.14727: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882698.14744: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882698.14757: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882698.14798: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882698.14805: stderr chunk (state=3): >>>debug2: match not found <<< 25201 1726882698.14815: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882698.14828: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25201 1726882698.14835: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 25201 1726882698.14849: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 25201 1726882698.14857: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882698.14870: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882698.14882: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882698.14890: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882698.14896: stderr chunk (state=3): >>>debug2: match found <<< 25201 1726882698.14905: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882698.14986: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882698.15002: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882698.15013: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882698.15135: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882698.16970: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882698.16978: stdout chunk (state=3): >>><<< 25201 1726882698.16985: stderr chunk (state=3): >>><<< 25201 1726882698.17001: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25201 1726882698.17004: _low_level_execute_command(): starting 25201 1726882698.17009: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882698.0530896-26014-191367721630125/AnsiballZ_command.py && sleep 0' 25201 1726882698.18249: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25201 1726882698.18258: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882698.18272: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882698.18288: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882698.18329: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882698.18337: stderr chunk (state=3): >>>debug2: match not found <<< 25201 1726882698.18344: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882698.18357: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25201 1726882698.18368: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 25201 1726882698.18373: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 25201 1726882698.18381: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882698.18390: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882698.18406: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882698.18413: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882698.18420: stderr chunk (state=3): >>>debug2: match found <<< 25201 1726882698.18429: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882698.18502: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882698.18523: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882698.18535: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882698.18667: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882698.33441: stdout chunk (state=3): >>> {"changed": true, "stdout": "veth0 
/etc/NetworkManager/system-connections/veth0.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep veth0 | grep /etc", "start": "2024-09-20 21:38:18.313970", "end": "2024-09-20 21:38:18.332361", "delta": "0:00:00.018391", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep veth0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 25201 1726882698.34737: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. <<< 25201 1726882698.34741: stdout chunk (state=3): >>><<< 25201 1726882698.34746: stderr chunk (state=3): >>><<< 25201 1726882698.34805: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "veth0 /etc/NetworkManager/system-connections/veth0.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep veth0 | grep /etc", "start": "2024-09-20 21:38:18.313970", "end": "2024-09-20 21:38:18.332361", "delta": "0:00:00.018391", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep veth0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
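
At this point _low_level_execute_command() has returned the AnsiballZ command module's result as a JSON document on stdout; the action plugin parses that JSON before producing the formatted task result that appears a few records below. A minimal sketch of that parsing step in Python, using the stdout captured in the record above (the JSON string is copied from this log with the long "invocation" block trimmed; the parsing wrapper is illustrative and is not ansible-core's actual code):

import json

# Raw stdout returned by AnsiballZ_command.py, copied from the log record above
# (the "invocation" block is trimmed for brevity).
raw_stdout = (
    '{"changed": true, '
    '"stdout": "veth0  /etc/NetworkManager/system-connections/veth0.nmconnection ", '
    '"stderr": "", "rc": 0, '
    '"cmd": "nmcli -f NAME,FILENAME connection show |grep veth0 | grep /etc", '
    '"start": "2024-09-20 21:38:18.313970", "end": "2024-09-20 21:38:18.332361", '
    '"delta": "0:00:00.018391", "msg": ""}'
)

result = json.loads(raw_stdout)

# rc == 0 is what the follow-up set_fact task keys on (nm_profile_exists.rc == 0).
profile_found = result["rc"] == 0
profile_file = result["stdout"].split()[-1] if profile_found else None

print(profile_found)  # True
print(profile_file)   # /etc/NetworkManager/system-connections/veth0.nmconnection
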
25201 1726882698.34850: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep veth0 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882698.0530896-26014-191367721630125/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 25201 1726882698.34854: _low_level_execute_command(): starting 25201 1726882698.34856: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882698.0530896-26014-191367721630125/ > /dev/null 2>&1 && sleep 0' 25201 1726882698.35402: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882698.35408: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882698.35435: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882698.35442: stderr chunk (state=3): >>>debug2: match not found <<< 25201 1726882698.35450: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882698.35459: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25201 1726882698.35470: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 25201 1726882698.35474: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882698.35486: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882698.35492: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882698.35497: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<< 25201 1726882698.35509: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882698.35555: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882698.35580: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882698.35586: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882698.35686: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882698.37888: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882698.37949: stderr chunk (state=3): >>><<< 25201 1726882698.37952: stdout chunk (state=3): >>><<< 25201 1726882698.38276: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25201 1726882698.38281: handler run complete 25201 1726882698.38284: Evaluated conditional (False): False 25201 1726882698.38286: attempt loop complete, returning result 25201 1726882698.38288: _execute() done 25201 1726882698.38290: dumping result to json 25201 1726882698.38292: done dumping result, returning 25201 1726882698.38293: done running TaskExecutor() for managed_node2/TASK: Get NM profile info [0e448fcc-3ce9-313b-197e-0000000004b3] 25201 1726882698.38295: sending task result for task 0e448fcc-3ce9-313b-197e-0000000004b3 25201 1726882698.38365: done sending task result for task 0e448fcc-3ce9-313b-197e-0000000004b3 25201 1726882698.38369: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep veth0 | grep /etc", "delta": "0:00:00.018391", "end": "2024-09-20 21:38:18.332361", "rc": 0, "start": "2024-09-20 21:38:18.313970" } STDOUT: veth0 /etc/NetworkManager/system-connections/veth0.nmconnection 25201 1726882698.38837: no more pending results, returning what we have 25201 1726882698.38841: results queue empty 25201 1726882698.38842: checking for any_errors_fatal 25201 1726882698.38846: done checking for any_errors_fatal 25201 1726882698.38847: checking for max_fail_percentage 25201 1726882698.38849: done checking for max_fail_percentage 25201 1726882698.38849: checking to see if all hosts have failed and the running result is not ok 25201 1726882698.38850: done checking to see if all hosts have failed 25201 1726882698.38851: getting the remaining hosts for this loop 25201 1726882698.38852: done getting the remaining hosts for this loop 25201 1726882698.38856: getting the next task for host managed_node2 25201 1726882698.38862: done getting next task for host managed_node2 25201 1726882698.38867: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 25201 1726882698.38871: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25201 1726882698.38876: getting variables 25201 1726882698.38878: in VariableManager get_vars() 25201 1726882698.38916: Calling all_inventory to load vars for managed_node2 25201 1726882698.38919: Calling groups_inventory to load vars for managed_node2 25201 1726882698.38921: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882698.38932: Calling all_plugins_play to load vars for managed_node2 25201 1726882698.38935: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882698.38938: Calling groups_plugins_play to load vars for managed_node2 25201 1726882698.40367: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882698.42089: done with get_vars() 25201 1726882698.42116: done getting variables 25201 1726882698.42186: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Friday 20 September 2024 21:38:18 -0400 (0:00:00.434) 0:00:19.596 ****** 25201 1726882698.42216: entering _queue_task() for managed_node2/set_fact 25201 1726882698.42511: worker is 1 (out of 1 available) 25201 1726882698.42524: exiting _queue_task() for managed_node2/set_fact 25201 1726882698.42539: done queuing things up, now waiting for results queue to drain 25201 1726882698.42541: waiting for pending results... 
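
The ok: result above boils down to one shell pipeline run on managed_node2. To reproduce the same check by hand while debugging, a rough equivalent is below; the pipeline is copied verbatim from the task's cmd field, and the Python wrapper is only a convenience that assumes nmcli is available on the machine you run it on:

import subprocess

# Same pipeline the "Get NM profile info" task ran (copied from the cmd field above).
cmd = "nmcli -f NAME,FILENAME connection show |grep veth0 | grep /etc"

proc = subprocess.run(cmd, shell=True, capture_output=True, text=True)

# The follow-up set_fact task in this log treats rc == 0 (grep matched a line)
# as "an NM profile for veth0 exists under /etc".
if proc.returncode == 0:
    print("profile present:", proc.stdout.strip())
else:
    print("no veth0 profile under /etc (rc=%d)" % proc.returncode)
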
25201 1726882698.42751: running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 25201 1726882698.42846: in run() - task 0e448fcc-3ce9-313b-197e-0000000004b4 25201 1726882698.42869: variable 'ansible_search_path' from source: unknown 25201 1726882698.42970: variable 'ansible_search_path' from source: unknown 25201 1726882698.42974: calling self._execute() 25201 1726882698.43075: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882698.43079: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882698.43082: variable 'omit' from source: magic vars 25201 1726882698.43633: variable 'ansible_distribution_major_version' from source: facts 25201 1726882698.43644: Evaluated conditional (ansible_distribution_major_version != '6'): True 25201 1726882698.43887: variable 'nm_profile_exists' from source: set_fact 25201 1726882698.43901: Evaluated conditional (nm_profile_exists.rc == 0): True 25201 1726882698.43908: variable 'omit' from source: magic vars 25201 1726882698.43962: variable 'omit' from source: magic vars 25201 1726882698.43993: variable 'omit' from source: magic vars 25201 1726882698.44035: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25201 1726882698.44093: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25201 1726882698.44114: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25201 1726882698.44132: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25201 1726882698.44144: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25201 1726882698.44201: variable 'inventory_hostname' from source: host vars for 'managed_node2' 25201 1726882698.44205: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882698.44209: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882698.44327: Set connection var ansible_shell_executable to /bin/sh 25201 1726882698.44334: Set connection var ansible_pipelining to False 25201 1726882698.44346: Set connection var ansible_connection to ssh 25201 1726882698.44353: Set connection var ansible_module_compression to ZIP_DEFLATED 25201 1726882698.44355: Set connection var ansible_shell_type to sh 25201 1726882698.44367: Set connection var ansible_timeout to 10 25201 1726882698.44396: variable 'ansible_shell_executable' from source: unknown 25201 1726882698.44399: variable 'ansible_connection' from source: unknown 25201 1726882698.44402: variable 'ansible_module_compression' from source: unknown 25201 1726882698.44404: variable 'ansible_shell_type' from source: unknown 25201 1726882698.44406: variable 'ansible_shell_executable' from source: unknown 25201 1726882698.44408: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882698.44412: variable 'ansible_pipelining' from source: unknown 25201 1726882698.44414: variable 'ansible_timeout' from source: unknown 25201 1726882698.44417: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882698.44555: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 25201 1726882698.44568: variable 'omit' from source: magic vars 25201 1726882698.44571: starting attempt loop 25201 1726882698.44574: running the handler 25201 1726882698.44590: handler run complete 25201 1726882698.44600: attempt loop complete, returning result 25201 1726882698.44606: _execute() done 25201 1726882698.44609: dumping result to json 25201 1726882698.44611: done dumping result, returning 25201 1726882698.44620: done running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [0e448fcc-3ce9-313b-197e-0000000004b4] 25201 1726882698.44625: sending task result for task 0e448fcc-3ce9-313b-197e-0000000004b4 25201 1726882698.44710: done sending task result for task 0e448fcc-3ce9-313b-197e-0000000004b4 25201 1726882698.44713: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "lsr_net_profile_ansible_managed": true, "lsr_net_profile_exists": true, "lsr_net_profile_fingerprint": true }, "changed": false } 25201 1726882698.44797: no more pending results, returning what we have 25201 1726882698.44800: results queue empty 25201 1726882698.44800: checking for any_errors_fatal 25201 1726882698.44806: done checking for any_errors_fatal 25201 1726882698.44807: checking for max_fail_percentage 25201 1726882698.44808: done checking for max_fail_percentage 25201 1726882698.44809: checking to see if all hosts have failed and the running result is not ok 25201 1726882698.44810: done checking to see if all hosts have failed 25201 1726882698.44811: getting the remaining hosts for this loop 25201 1726882698.44812: done getting the remaining hosts for this loop 25201 1726882698.44815: getting the next task for host managed_node2 25201 1726882698.44823: done getting next task for host managed_node2 25201 1726882698.44825: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 25201 1726882698.44829: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25201 1726882698.44832: getting variables 25201 1726882698.44833: in VariableManager get_vars() 25201 1726882698.44868: Calling all_inventory to load vars for managed_node2 25201 1726882698.44870: Calling groups_inventory to load vars for managed_node2 25201 1726882698.44873: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882698.44881: Calling all_plugins_play to load vars for managed_node2 25201 1726882698.44883: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882698.44886: Calling groups_plugins_play to load vars for managed_node2 25201 1726882698.45851: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882698.47429: done with get_vars() 25201 1726882698.47444: done getting variables 25201 1726882698.47491: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 25201 1726882698.47578: variable 'profile' from source: include params 25201 1726882698.47582: variable 'interface' from source: play vars 25201 1726882698.47625: variable 'interface' from source: play vars TASK [Get the ansible_managed comment in ifcfg-veth0] ************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Friday 20 September 2024 21:38:18 -0400 (0:00:00.054) 0:00:19.651 ****** 25201 1726882698.47652: entering _queue_task() for managed_node2/command 25201 1726882698.47845: worker is 1 (out of 1 available) 25201 1726882698.47860: exiting _queue_task() for managed_node2/command 25201 1726882698.47876: done queuing things up, now waiting for results queue to drain 25201 1726882698.47878: waiting for pending results... 
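
The set_fact task above only runs because both of its conditionals evaluated to True (the host is not EL6 and the registered nmcli result has rc == 0), and its whole effect is to record three boolean facts. A small sketch of that control flow, reconstructed from the evaluated conditionals and the ansible_facts shown in the ok: result; the distribution version value is an assumption since it is not printed in this excerpt, and this is not the literal content of get_profile_stat.yml:

# Facts gathered earlier in the run.
ansible_distribution_major_version = "9"   # assumed value; only "!= '6'" matters here
nm_profile_exists = {"rc": 0}              # registered result of the nmcli task above

facts = {}

# Mirrors the two conditionals the task evaluated before running its handler.
if ansible_distribution_major_version != "6" and nm_profile_exists["rc"] == 0:
    facts.update(
        lsr_net_profile_exists=True,
        lsr_net_profile_ansible_managed=True,
        lsr_net_profile_fingerprint=True,
    )

print(facts)
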
25201 1726882698.48041: running TaskExecutor() for managed_node2/TASK: Get the ansible_managed comment in ifcfg-veth0 25201 1726882698.48117: in run() - task 0e448fcc-3ce9-313b-197e-0000000004b6 25201 1726882698.48127: variable 'ansible_search_path' from source: unknown 25201 1726882698.48131: variable 'ansible_search_path' from source: unknown 25201 1726882698.48157: calling self._execute() 25201 1726882698.48233: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882698.48236: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882698.48244: variable 'omit' from source: magic vars 25201 1726882698.48507: variable 'ansible_distribution_major_version' from source: facts 25201 1726882698.48518: Evaluated conditional (ansible_distribution_major_version != '6'): True 25201 1726882698.48605: variable 'profile_stat' from source: set_fact 25201 1726882698.48614: Evaluated conditional (profile_stat.stat.exists): False 25201 1726882698.48619: when evaluation is False, skipping this task 25201 1726882698.48622: _execute() done 25201 1726882698.48625: dumping result to json 25201 1726882698.48627: done dumping result, returning 25201 1726882698.48631: done running TaskExecutor() for managed_node2/TASK: Get the ansible_managed comment in ifcfg-veth0 [0e448fcc-3ce9-313b-197e-0000000004b6] 25201 1726882698.48640: sending task result for task 0e448fcc-3ce9-313b-197e-0000000004b6 25201 1726882698.48716: done sending task result for task 0e448fcc-3ce9-313b-197e-0000000004b6 25201 1726882698.48718: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 25201 1726882698.48795: no more pending results, returning what we have 25201 1726882698.48798: results queue empty 25201 1726882698.48799: checking for any_errors_fatal 25201 1726882698.48803: done checking for any_errors_fatal 25201 1726882698.48803: checking for max_fail_percentage 25201 1726882698.48805: done checking for max_fail_percentage 25201 1726882698.48805: checking to see if all hosts have failed and the running result is not ok 25201 1726882698.48806: done checking to see if all hosts have failed 25201 1726882698.48807: getting the remaining hosts for this loop 25201 1726882698.48808: done getting the remaining hosts for this loop 25201 1726882698.48811: getting the next task for host managed_node2 25201 1726882698.48817: done getting next task for host managed_node2 25201 1726882698.48819: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 25201 1726882698.48822: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25201 1726882698.48825: getting variables 25201 1726882698.48826: in VariableManager get_vars() 25201 1726882698.48859: Calling all_inventory to load vars for managed_node2 25201 1726882698.48861: Calling groups_inventory to load vars for managed_node2 25201 1726882698.48862: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882698.48871: Calling all_plugins_play to load vars for managed_node2 25201 1726882698.48873: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882698.48875: Calling groups_plugins_play to load vars for managed_node2 25201 1726882698.49622: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882698.50542: done with get_vars() 25201 1726882698.50556: done getting variables 25201 1726882698.50600: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 25201 1726882698.50673: variable 'profile' from source: include params 25201 1726882698.50676: variable 'interface' from source: play vars 25201 1726882698.50716: variable 'interface' from source: play vars TASK [Verify the ansible_managed comment in ifcfg-veth0] *********************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Friday 20 September 2024 21:38:18 -0400 (0:00:00.030) 0:00:19.682 ****** 25201 1726882698.50737: entering _queue_task() for managed_node2/set_fact 25201 1726882698.50905: worker is 1 (out of 1 available) 25201 1726882698.50916: exiting _queue_task() for managed_node2/set_fact 25201 1726882698.50927: done queuing things up, now waiting for results queue to drain 25201 1726882698.50929: waiting for pending results... 
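
This task, and the three that follow it in the log (verify the ansible_managed comment, get the fingerprint comment, verify the fingerprint comment), are all skipped for the same reason: profile_stat.stat.exists is False, because on this host the veth0 profile is an NM keyfile under /etc/NetworkManager/system-connections rather than an initscripts ifcfg file. A sketch of the shared gate, reconstructed from the false_condition in the skip results; the ifcfg path below is the conventional location and is an assumption, since the stat task itself is not part of this excerpt:

import os

profile = "veth0"

# Assumed initscripts-style location; the actual path used by the stat task is
# not shown in this excerpt.
ifcfg_path = f"/etc/sysconfig/network-scripts/ifcfg-{profile}"

profile_stat_exists = os.path.exists(ifcfg_path)

# Mirrors "when: profile_stat.stat.exists" on the four ifcfg inspection tasks:
# with a keyfile-only connection the gate is False and each task is skipped.
if profile_stat_exists:
    print(f"would inspect {ifcfg_path} for the ansible_managed / fingerprint comments")
else:
    print(f"skipping: {ifcfg_path} does not exist")
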
25201 1726882698.51098: running TaskExecutor() for managed_node2/TASK: Verify the ansible_managed comment in ifcfg-veth0 25201 1726882698.51173: in run() - task 0e448fcc-3ce9-313b-197e-0000000004b7 25201 1726882698.51182: variable 'ansible_search_path' from source: unknown 25201 1726882698.51185: variable 'ansible_search_path' from source: unknown 25201 1726882698.51211: calling self._execute() 25201 1726882698.51282: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882698.51286: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882698.51294: variable 'omit' from source: magic vars 25201 1726882698.51544: variable 'ansible_distribution_major_version' from source: facts 25201 1726882698.51555: Evaluated conditional (ansible_distribution_major_version != '6'): True 25201 1726882698.51638: variable 'profile_stat' from source: set_fact 25201 1726882698.51648: Evaluated conditional (profile_stat.stat.exists): False 25201 1726882698.51651: when evaluation is False, skipping this task 25201 1726882698.51654: _execute() done 25201 1726882698.51657: dumping result to json 25201 1726882698.51659: done dumping result, returning 25201 1726882698.51668: done running TaskExecutor() for managed_node2/TASK: Verify the ansible_managed comment in ifcfg-veth0 [0e448fcc-3ce9-313b-197e-0000000004b7] 25201 1726882698.51671: sending task result for task 0e448fcc-3ce9-313b-197e-0000000004b7 25201 1726882698.51750: done sending task result for task 0e448fcc-3ce9-313b-197e-0000000004b7 25201 1726882698.51752: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 25201 1726882698.51803: no more pending results, returning what we have 25201 1726882698.51806: results queue empty 25201 1726882698.51807: checking for any_errors_fatal 25201 1726882698.51810: done checking for any_errors_fatal 25201 1726882698.51811: checking for max_fail_percentage 25201 1726882698.51813: done checking for max_fail_percentage 25201 1726882698.51813: checking to see if all hosts have failed and the running result is not ok 25201 1726882698.51814: done checking to see if all hosts have failed 25201 1726882698.51815: getting the remaining hosts for this loop 25201 1726882698.51816: done getting the remaining hosts for this loop 25201 1726882698.51819: getting the next task for host managed_node2 25201 1726882698.51825: done getting next task for host managed_node2 25201 1726882698.51827: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 25201 1726882698.51830: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25201 1726882698.51833: getting variables 25201 1726882698.51834: in VariableManager get_vars() 25201 1726882698.51868: Calling all_inventory to load vars for managed_node2 25201 1726882698.51870: Calling groups_inventory to load vars for managed_node2 25201 1726882698.51873: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882698.51879: Calling all_plugins_play to load vars for managed_node2 25201 1726882698.51881: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882698.51882: Calling groups_plugins_play to load vars for managed_node2 25201 1726882698.52721: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882698.53626: done with get_vars() 25201 1726882698.53640: done getting variables 25201 1726882698.53681: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 25201 1726882698.53750: variable 'profile' from source: include params 25201 1726882698.53752: variable 'interface' from source: play vars 25201 1726882698.53793: variable 'interface' from source: play vars TASK [Get the fingerprint comment in ifcfg-veth0] ****************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Friday 20 September 2024 21:38:18 -0400 (0:00:00.030) 0:00:19.712 ****** 25201 1726882698.53813: entering _queue_task() for managed_node2/command 25201 1726882698.53980: worker is 1 (out of 1 available) 25201 1726882698.53991: exiting _queue_task() for managed_node2/command 25201 1726882698.54002: done queuing things up, now waiting for results queue to drain 25201 1726882698.54004: waiting for pending results... 
25201 1726882698.54152: running TaskExecutor() for managed_node2/TASK: Get the fingerprint comment in ifcfg-veth0 25201 1726882698.54227: in run() - task 0e448fcc-3ce9-313b-197e-0000000004b8 25201 1726882698.54236: variable 'ansible_search_path' from source: unknown 25201 1726882698.54240: variable 'ansible_search_path' from source: unknown 25201 1726882698.54269: calling self._execute() 25201 1726882698.54331: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882698.54334: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882698.54344: variable 'omit' from source: magic vars 25201 1726882698.54587: variable 'ansible_distribution_major_version' from source: facts 25201 1726882698.54596: Evaluated conditional (ansible_distribution_major_version != '6'): True 25201 1726882698.54679: variable 'profile_stat' from source: set_fact 25201 1726882698.54689: Evaluated conditional (profile_stat.stat.exists): False 25201 1726882698.54692: when evaluation is False, skipping this task 25201 1726882698.54694: _execute() done 25201 1726882698.54697: dumping result to json 25201 1726882698.54699: done dumping result, returning 25201 1726882698.54705: done running TaskExecutor() for managed_node2/TASK: Get the fingerprint comment in ifcfg-veth0 [0e448fcc-3ce9-313b-197e-0000000004b8] 25201 1726882698.54711: sending task result for task 0e448fcc-3ce9-313b-197e-0000000004b8 25201 1726882698.54792: done sending task result for task 0e448fcc-3ce9-313b-197e-0000000004b8 25201 1726882698.54795: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 25201 1726882698.54845: no more pending results, returning what we have 25201 1726882698.54848: results queue empty 25201 1726882698.54849: checking for any_errors_fatal 25201 1726882698.54855: done checking for any_errors_fatal 25201 1726882698.54855: checking for max_fail_percentage 25201 1726882698.54857: done checking for max_fail_percentage 25201 1726882698.54858: checking to see if all hosts have failed and the running result is not ok 25201 1726882698.54859: done checking to see if all hosts have failed 25201 1726882698.54859: getting the remaining hosts for this loop 25201 1726882698.54861: done getting the remaining hosts for this loop 25201 1726882698.54867: getting the next task for host managed_node2 25201 1726882698.54873: done getting next task for host managed_node2 25201 1726882698.54875: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 25201 1726882698.54878: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25201 1726882698.54881: getting variables 25201 1726882698.54882: in VariableManager get_vars() 25201 1726882698.54907: Calling all_inventory to load vars for managed_node2 25201 1726882698.54909: Calling groups_inventory to load vars for managed_node2 25201 1726882698.54911: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882698.54917: Calling all_plugins_play to load vars for managed_node2 25201 1726882698.54918: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882698.54920: Calling groups_plugins_play to load vars for managed_node2 25201 1726882698.55666: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882698.56654: done with get_vars() 25201 1726882698.56674: done getting variables 25201 1726882698.56710: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 25201 1726882698.56781: variable 'profile' from source: include params 25201 1726882698.56784: variable 'interface' from source: play vars 25201 1726882698.56821: variable 'interface' from source: play vars TASK [Verify the fingerprint comment in ifcfg-veth0] *************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Friday 20 September 2024 21:38:18 -0400 (0:00:00.030) 0:00:19.743 ****** 25201 1726882698.56841: entering _queue_task() for managed_node2/set_fact 25201 1726882698.57003: worker is 1 (out of 1 available) 25201 1726882698.57016: exiting _queue_task() for managed_node2/set_fact 25201 1726882698.57027: done queuing things up, now waiting for results queue to drain 25201 1726882698.57031: waiting for pending results... 
25201 1726882698.57179: running TaskExecutor() for managed_node2/TASK: Verify the fingerprint comment in ifcfg-veth0 25201 1726882698.57245: in run() - task 0e448fcc-3ce9-313b-197e-0000000004b9 25201 1726882698.57255: variable 'ansible_search_path' from source: unknown 25201 1726882698.57258: variable 'ansible_search_path' from source: unknown 25201 1726882698.57285: calling self._execute() 25201 1726882698.57343: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882698.57347: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882698.57355: variable 'omit' from source: magic vars 25201 1726882698.57589: variable 'ansible_distribution_major_version' from source: facts 25201 1726882698.57598: Evaluated conditional (ansible_distribution_major_version != '6'): True 25201 1726882698.57678: variable 'profile_stat' from source: set_fact 25201 1726882698.57688: Evaluated conditional (profile_stat.stat.exists): False 25201 1726882698.57691: when evaluation is False, skipping this task 25201 1726882698.57694: _execute() done 25201 1726882698.57696: dumping result to json 25201 1726882698.57698: done dumping result, returning 25201 1726882698.57704: done running TaskExecutor() for managed_node2/TASK: Verify the fingerprint comment in ifcfg-veth0 [0e448fcc-3ce9-313b-197e-0000000004b9] 25201 1726882698.57709: sending task result for task 0e448fcc-3ce9-313b-197e-0000000004b9 25201 1726882698.57791: done sending task result for task 0e448fcc-3ce9-313b-197e-0000000004b9 25201 1726882698.57795: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 25201 1726882698.57844: no more pending results, returning what we have 25201 1726882698.57847: results queue empty 25201 1726882698.57848: checking for any_errors_fatal 25201 1726882698.57851: done checking for any_errors_fatal 25201 1726882698.57852: checking for max_fail_percentage 25201 1726882698.57853: done checking for max_fail_percentage 25201 1726882698.57854: checking to see if all hosts have failed and the running result is not ok 25201 1726882698.57855: done checking to see if all hosts have failed 25201 1726882698.57856: getting the remaining hosts for this loop 25201 1726882698.57857: done getting the remaining hosts for this loop 25201 1726882698.57860: getting the next task for host managed_node2 25201 1726882698.57871: done getting next task for host managed_node2 25201 1726882698.57874: ^ task is: TASK: Assert that the profile is present - '{{ profile }}' 25201 1726882698.57876: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25201 1726882698.57879: getting variables 25201 1726882698.57881: in VariableManager get_vars() 25201 1726882698.57910: Calling all_inventory to load vars for managed_node2 25201 1726882698.57912: Calling groups_inventory to load vars for managed_node2 25201 1726882698.57914: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882698.57920: Calling all_plugins_play to load vars for managed_node2 25201 1726882698.57921: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882698.57923: Calling groups_plugins_play to load vars for managed_node2 25201 1726882698.58659: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882698.59591: done with get_vars() 25201 1726882698.59606: done getting variables 25201 1726882698.59648: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 25201 1726882698.59726: variable 'profile' from source: include params 25201 1726882698.59729: variable 'interface' from source: play vars 25201 1726882698.59772: variable 'interface' from source: play vars TASK [Assert that the profile is present - 'veth0'] **************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5 Friday 20 September 2024 21:38:18 -0400 (0:00:00.029) 0:00:19.772 ****** 25201 1726882698.59793: entering _queue_task() for managed_node2/assert 25201 1726882698.59991: worker is 1 (out of 1 available) 25201 1726882698.60004: exiting _queue_task() for managed_node2/assert 25201 1726882698.60017: done queuing things up, now waiting for results queue to drain 25201 1726882698.60019: waiting for pending results... 
25201 1726882698.60182: running TaskExecutor() for managed_node2/TASK: Assert that the profile is present - 'veth0' 25201 1726882698.60251: in run() - task 0e448fcc-3ce9-313b-197e-0000000003b9 25201 1726882698.60261: variable 'ansible_search_path' from source: unknown 25201 1726882698.60270: variable 'ansible_search_path' from source: unknown 25201 1726882698.60302: calling self._execute() 25201 1726882698.60362: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882698.60370: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882698.60377: variable 'omit' from source: magic vars 25201 1726882698.60628: variable 'ansible_distribution_major_version' from source: facts 25201 1726882698.60639: Evaluated conditional (ansible_distribution_major_version != '6'): True 25201 1726882698.60644: variable 'omit' from source: magic vars 25201 1726882698.60677: variable 'omit' from source: magic vars 25201 1726882698.60745: variable 'profile' from source: include params 25201 1726882698.60749: variable 'interface' from source: play vars 25201 1726882698.60795: variable 'interface' from source: play vars 25201 1726882698.60809: variable 'omit' from source: magic vars 25201 1726882698.60845: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25201 1726882698.60870: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25201 1726882698.60885: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25201 1726882698.60899: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25201 1726882698.60909: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25201 1726882698.60932: variable 'inventory_hostname' from source: host vars for 'managed_node2' 25201 1726882698.60935: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882698.60938: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882698.61008: Set connection var ansible_shell_executable to /bin/sh 25201 1726882698.61011: Set connection var ansible_pipelining to False 25201 1726882698.61018: Set connection var ansible_connection to ssh 25201 1726882698.61020: Set connection var ansible_module_compression to ZIP_DEFLATED 25201 1726882698.61023: Set connection var ansible_shell_type to sh 25201 1726882698.61030: Set connection var ansible_timeout to 10 25201 1726882698.61046: variable 'ansible_shell_executable' from source: unknown 25201 1726882698.61048: variable 'ansible_connection' from source: unknown 25201 1726882698.61050: variable 'ansible_module_compression' from source: unknown 25201 1726882698.61053: variable 'ansible_shell_type' from source: unknown 25201 1726882698.61056: variable 'ansible_shell_executable' from source: unknown 25201 1726882698.61059: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882698.61074: variable 'ansible_pipelining' from source: unknown 25201 1726882698.61078: variable 'ansible_timeout' from source: unknown 25201 1726882698.61080: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882698.61169: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 25201 1726882698.61177: variable 'omit' from source: magic vars 25201 1726882698.61180: starting attempt loop 25201 1726882698.61183: running the handler 25201 1726882698.61257: variable 'lsr_net_profile_exists' from source: set_fact 25201 1726882698.61262: Evaluated conditional (lsr_net_profile_exists): True 25201 1726882698.61269: handler run complete 25201 1726882698.61281: attempt loop complete, returning result 25201 1726882698.61284: _execute() done 25201 1726882698.61287: dumping result to json 25201 1726882698.61290: done dumping result, returning 25201 1726882698.61299: done running TaskExecutor() for managed_node2/TASK: Assert that the profile is present - 'veth0' [0e448fcc-3ce9-313b-197e-0000000003b9] 25201 1726882698.61301: sending task result for task 0e448fcc-3ce9-313b-197e-0000000003b9 25201 1726882698.61377: done sending task result for task 0e448fcc-3ce9-313b-197e-0000000003b9 25201 1726882698.61380: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 25201 1726882698.61433: no more pending results, returning what we have 25201 1726882698.61436: results queue empty 25201 1726882698.61437: checking for any_errors_fatal 25201 1726882698.61442: done checking for any_errors_fatal 25201 1726882698.61442: checking for max_fail_percentage 25201 1726882698.61444: done checking for max_fail_percentage 25201 1726882698.61445: checking to see if all hosts have failed and the running result is not ok 25201 1726882698.61445: done checking to see if all hosts have failed 25201 1726882698.61446: getting the remaining hosts for this loop 25201 1726882698.61448: done getting the remaining hosts for this loop 25201 1726882698.61451: getting the next task for host managed_node2 25201 1726882698.61456: done getting next task for host managed_node2 25201 1726882698.61459: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}' 25201 1726882698.61461: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25201 1726882698.61468: getting variables 25201 1726882698.61470: in VariableManager get_vars() 25201 1726882698.61502: Calling all_inventory to load vars for managed_node2 25201 1726882698.61504: Calling groups_inventory to load vars for managed_node2 25201 1726882698.61506: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882698.61521: Calling all_plugins_play to load vars for managed_node2 25201 1726882698.61523: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882698.61526: Calling groups_plugins_play to load vars for managed_node2 25201 1726882698.65732: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882698.67188: done with get_vars() 25201 1726882698.67209: done getting variables 25201 1726882698.67253: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 25201 1726882698.67347: variable 'profile' from source: include params 25201 1726882698.67350: variable 'interface' from source: play vars 25201 1726882698.67409: variable 'interface' from source: play vars TASK [Assert that the ansible managed comment is present in 'veth0'] *********** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10 Friday 20 September 2024 21:38:18 -0400 (0:00:00.076) 0:00:19.848 ****** 25201 1726882698.67437: entering _queue_task() for managed_node2/assert 25201 1726882698.67735: worker is 1 (out of 1 available) 25201 1726882698.67747: exiting _queue_task() for managed_node2/assert 25201 1726882698.67760: done queuing things up, now waiting for results queue to drain 25201 1726882698.67762: waiting for pending results... 
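The trace above shows the assert action plugin resolving its connection and shell settings, evaluating the conditional lsr_net_profile_exists (a fact registered earlier via set_fact, per "from source: set_fact") and returning "All assertions passed". The exact YAML of the task in assert_profile_present.yml is not included in this log, but based on the task name and the evaluated conditional it is likely close to this minimal sketch:

  # Hypothetical reconstruction -- only the task title and the conditional
  # lsr_net_profile_exists are taken from the log above; the rest is assumed.
  - name: "Assert that the profile is present - '{{ profile }}'"
    assert:
      that:
        - lsr_net_profile_exists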
25201 1726882698.68042: running TaskExecutor() for managed_node2/TASK: Assert that the ansible managed comment is present in 'veth0' 25201 1726882698.68155: in run() - task 0e448fcc-3ce9-313b-197e-0000000003ba 25201 1726882698.68177: variable 'ansible_search_path' from source: unknown 25201 1726882698.68185: variable 'ansible_search_path' from source: unknown 25201 1726882698.68227: calling self._execute() 25201 1726882698.68327: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882698.68338: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882698.68352: variable 'omit' from source: magic vars 25201 1726882698.68716: variable 'ansible_distribution_major_version' from source: facts 25201 1726882698.68733: Evaluated conditional (ansible_distribution_major_version != '6'): True 25201 1726882698.68748: variable 'omit' from source: magic vars 25201 1726882698.68794: variable 'omit' from source: magic vars 25201 1726882698.68900: variable 'profile' from source: include params 25201 1726882698.68910: variable 'interface' from source: play vars 25201 1726882698.68983: variable 'interface' from source: play vars 25201 1726882698.69006: variable 'omit' from source: magic vars 25201 1726882698.69047: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25201 1726882698.69094: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25201 1726882698.69118: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25201 1726882698.69138: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25201 1726882698.69153: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25201 1726882698.69194: variable 'inventory_hostname' from source: host vars for 'managed_node2' 25201 1726882698.69202: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882698.69210: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882698.69315: Set connection var ansible_shell_executable to /bin/sh 25201 1726882698.69325: Set connection var ansible_pipelining to False 25201 1726882698.69333: Set connection var ansible_connection to ssh 25201 1726882698.69342: Set connection var ansible_module_compression to ZIP_DEFLATED 25201 1726882698.69348: Set connection var ansible_shell_type to sh 25201 1726882698.69358: Set connection var ansible_timeout to 10 25201 1726882698.69386: variable 'ansible_shell_executable' from source: unknown 25201 1726882698.69397: variable 'ansible_connection' from source: unknown 25201 1726882698.69404: variable 'ansible_module_compression' from source: unknown 25201 1726882698.69409: variable 'ansible_shell_type' from source: unknown 25201 1726882698.69416: variable 'ansible_shell_executable' from source: unknown 25201 1726882698.69422: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882698.69428: variable 'ansible_pipelining' from source: unknown 25201 1726882698.69435: variable 'ansible_timeout' from source: unknown 25201 1726882698.69441: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882698.69583: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 25201 1726882698.69598: variable 'omit' from source: magic vars 25201 1726882698.69607: starting attempt loop 25201 1726882698.69616: running the handler 25201 1726882698.69727: variable 'lsr_net_profile_ansible_managed' from source: set_fact 25201 1726882698.69737: Evaluated conditional (lsr_net_profile_ansible_managed): True 25201 1726882698.69745: handler run complete 25201 1726882698.69768: attempt loop complete, returning result 25201 1726882698.69776: _execute() done 25201 1726882698.69782: dumping result to json 25201 1726882698.69788: done dumping result, returning 25201 1726882698.69799: done running TaskExecutor() for managed_node2/TASK: Assert that the ansible managed comment is present in 'veth0' [0e448fcc-3ce9-313b-197e-0000000003ba] 25201 1726882698.69809: sending task result for task 0e448fcc-3ce9-313b-197e-0000000003ba 25201 1726882698.69911: done sending task result for task 0e448fcc-3ce9-313b-197e-0000000003ba 25201 1726882698.69918: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 25201 1726882698.69984: no more pending results, returning what we have 25201 1726882698.69988: results queue empty 25201 1726882698.69993: checking for any_errors_fatal 25201 1726882698.70000: done checking for any_errors_fatal 25201 1726882698.70001: checking for max_fail_percentage 25201 1726882698.70003: done checking for max_fail_percentage 25201 1726882698.70004: checking to see if all hosts have failed and the running result is not ok 25201 1726882698.70005: done checking to see if all hosts have failed 25201 1726882698.70006: getting the remaining hosts for this loop 25201 1726882698.70007: done getting the remaining hosts for this loop 25201 1726882698.70011: getting the next task for host managed_node2 25201 1726882698.70017: done getting next task for host managed_node2 25201 1726882698.70019: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }} 25201 1726882698.70022: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25201 1726882698.70027: getting variables 25201 1726882698.70029: in VariableManager get_vars() 25201 1726882698.70075: Calling all_inventory to load vars for managed_node2 25201 1726882698.70078: Calling groups_inventory to load vars for managed_node2 25201 1726882698.70081: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882698.70092: Calling all_plugins_play to load vars for managed_node2 25201 1726882698.70095: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882698.70098: Calling groups_plugins_play to load vars for managed_node2 25201 1726882698.71721: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882698.74097: done with get_vars() 25201 1726882698.74119: done getting variables 25201 1726882698.74177: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 25201 1726882698.74286: variable 'profile' from source: include params 25201 1726882698.74290: variable 'interface' from source: play vars 25201 1726882698.74347: variable 'interface' from source: play vars TASK [Assert that the fingerprint comment is present in veth0] ***************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15 Friday 20 September 2024 21:38:18 -0400 (0:00:00.069) 0:00:19.918 ****** 25201 1726882698.74384: entering _queue_task() for managed_node2/assert 25201 1726882698.74661: worker is 1 (out of 1 available) 25201 1726882698.74678: exiting _queue_task() for managed_node2/assert 25201 1726882698.74691: done queuing things up, now waiting for results queue to drain 25201 1726882698.74692: waiting for pending results... 
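The next task, reported at assert_profile_present.yml:10, follows the same pattern: the conditional lsr_net_profile_ansible_managed (also sourced from set_fact) evaluates to True and the assert completes without contacting the remote host. A plausible sketch of that task, assuming the same single-conditional structure:

  # Hypothetical sketch -- the conditional name and task title come from the log;
  # the surrounding YAML is assumed.
  - name: "Assert that the ansible managed comment is present in '{{ profile }}'"
    assert:
      that:
        - lsr_net_profile_ansible_managed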
25201 1726882698.74959: running TaskExecutor() for managed_node2/TASK: Assert that the fingerprint comment is present in veth0 25201 1726882698.75074: in run() - task 0e448fcc-3ce9-313b-197e-0000000003bb 25201 1726882698.75092: variable 'ansible_search_path' from source: unknown 25201 1726882698.75099: variable 'ansible_search_path' from source: unknown 25201 1726882698.75139: calling self._execute() 25201 1726882698.75232: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882698.75247: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882698.75260: variable 'omit' from source: magic vars 25201 1726882698.75616: variable 'ansible_distribution_major_version' from source: facts 25201 1726882698.75632: Evaluated conditional (ansible_distribution_major_version != '6'): True 25201 1726882698.75642: variable 'omit' from source: magic vars 25201 1726882698.75687: variable 'omit' from source: magic vars 25201 1726882698.75792: variable 'profile' from source: include params 25201 1726882698.75802: variable 'interface' from source: play vars 25201 1726882698.75869: variable 'interface' from source: play vars 25201 1726882698.75896: variable 'omit' from source: magic vars 25201 1726882698.75939: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25201 1726882698.75980: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25201 1726882698.76008: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25201 1726882698.76029: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25201 1726882698.76045: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25201 1726882698.76081: variable 'inventory_hostname' from source: host vars for 'managed_node2' 25201 1726882698.76090: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882698.76096: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882698.76201: Set connection var ansible_shell_executable to /bin/sh 25201 1726882698.76216: Set connection var ansible_pipelining to False 25201 1726882698.76225: Set connection var ansible_connection to ssh 25201 1726882698.76235: Set connection var ansible_module_compression to ZIP_DEFLATED 25201 1726882698.76243: Set connection var ansible_shell_type to sh 25201 1726882698.76258: Set connection var ansible_timeout to 10 25201 1726882698.76292: variable 'ansible_shell_executable' from source: unknown 25201 1726882698.76302: variable 'ansible_connection' from source: unknown 25201 1726882698.76311: variable 'ansible_module_compression' from source: unknown 25201 1726882698.76324: variable 'ansible_shell_type' from source: unknown 25201 1726882698.76344: variable 'ansible_shell_executable' from source: unknown 25201 1726882698.76355: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882698.76441: variable 'ansible_pipelining' from source: unknown 25201 1726882698.76449: variable 'ansible_timeout' from source: unknown 25201 1726882698.76458: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882698.76594: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 25201 1726882698.76609: variable 'omit' from source: magic vars 25201 1726882698.76617: starting attempt loop 25201 1726882698.76622: running the handler 25201 1726882698.76735: variable 'lsr_net_profile_fingerprint' from source: set_fact 25201 1726882698.76744: Evaluated conditional (lsr_net_profile_fingerprint): True 25201 1726882698.76751: handler run complete 25201 1726882698.76775: attempt loop complete, returning result 25201 1726882698.76782: _execute() done 25201 1726882698.76789: dumping result to json 25201 1726882698.76796: done dumping result, returning 25201 1726882698.76805: done running TaskExecutor() for managed_node2/TASK: Assert that the fingerprint comment is present in veth0 [0e448fcc-3ce9-313b-197e-0000000003bb] 25201 1726882698.76813: sending task result for task 0e448fcc-3ce9-313b-197e-0000000003bb ok: [managed_node2] => { "changed": false } MSG: All assertions passed 25201 1726882698.76954: no more pending results, returning what we have 25201 1726882698.76957: results queue empty 25201 1726882698.76958: checking for any_errors_fatal 25201 1726882698.76968: done checking for any_errors_fatal 25201 1726882698.76969: checking for max_fail_percentage 25201 1726882698.76971: done checking for max_fail_percentage 25201 1726882698.76971: checking to see if all hosts have failed and the running result is not ok 25201 1726882698.76972: done checking to see if all hosts have failed 25201 1726882698.76973: getting the remaining hosts for this loop 25201 1726882698.76975: done getting the remaining hosts for this loop 25201 1726882698.76979: getting the next task for host managed_node2 25201 1726882698.76988: done getting next task for host managed_node2 25201 1726882698.76991: ^ task is: TASK: Get ip address information 25201 1726882698.76992: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25201 1726882698.76996: getting variables 25201 1726882698.76998: in VariableManager get_vars() 25201 1726882698.77042: Calling all_inventory to load vars for managed_node2 25201 1726882698.77045: Calling groups_inventory to load vars for managed_node2 25201 1726882698.77048: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882698.77058: Calling all_plugins_play to load vars for managed_node2 25201 1726882698.77062: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882698.77069: Calling groups_plugins_play to load vars for managed_node2 25201 1726882698.78385: done sending task result for task 0e448fcc-3ce9-313b-197e-0000000003bb 25201 1726882698.78389: WORKER PROCESS EXITING 25201 1726882698.78823: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882698.81223: done with get_vars() 25201 1726882698.81254: done getting variables 25201 1726882698.81325: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Get ip address information] ********************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:53 Friday 20 September 2024 21:38:18 -0400 (0:00:00.069) 0:00:19.988 ****** 25201 1726882698.81354: entering _queue_task() for managed_node2/command 25201 1726882698.82371: worker is 1 (out of 1 available) 25201 1726882698.82384: exiting _queue_task() for managed_node2/command 25201 1726882698.82396: done queuing things up, now waiting for results queue to drain 25201 1726882698.82397: waiting for pending results... 
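The third assert in the same file (assert_profile_present.yml:15) checks lsr_net_profile_fingerprint. Note that all three asserts run entirely on the controller: the action plugin only evaluates facts that were gathered earlier, so no module is shipped to managed_node2 for them. A hedged sketch, again assuming a single conditional:

  # Hypothetical sketch based on the task title and conditional shown in the log.
  - name: Assert that the fingerprint comment is present in {{ profile }}
    assert:
      that:
        - lsr_net_profile_fingerprint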
25201 1726882698.82988: running TaskExecutor() for managed_node2/TASK: Get ip address information 25201 1726882698.83101: in run() - task 0e448fcc-3ce9-313b-197e-00000000005e 25201 1726882698.83120: variable 'ansible_search_path' from source: unknown 25201 1726882698.83172: calling self._execute() 25201 1726882698.83287: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882698.83299: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882698.83314: variable 'omit' from source: magic vars 25201 1726882698.83730: variable 'ansible_distribution_major_version' from source: facts 25201 1726882698.83747: Evaluated conditional (ansible_distribution_major_version != '6'): True 25201 1726882698.83757: variable 'omit' from source: magic vars 25201 1726882698.83790: variable 'omit' from source: magic vars 25201 1726882698.83900: variable 'interface' from source: play vars 25201 1726882698.83928: variable 'omit' from source: magic vars 25201 1726882698.83978: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25201 1726882698.84025: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25201 1726882698.84050: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25201 1726882698.84077: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25201 1726882698.84094: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25201 1726882698.84138: variable 'inventory_hostname' from source: host vars for 'managed_node2' 25201 1726882698.84148: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882698.84156: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882698.84279: Set connection var ansible_shell_executable to /bin/sh 25201 1726882698.84289: Set connection var ansible_pipelining to False 25201 1726882698.84298: Set connection var ansible_connection to ssh 25201 1726882698.84306: Set connection var ansible_module_compression to ZIP_DEFLATED 25201 1726882698.84312: Set connection var ansible_shell_type to sh 25201 1726882698.84322: Set connection var ansible_timeout to 10 25201 1726882698.84353: variable 'ansible_shell_executable' from source: unknown 25201 1726882698.84360: variable 'ansible_connection' from source: unknown 25201 1726882698.84372: variable 'ansible_module_compression' from source: unknown 25201 1726882698.84378: variable 'ansible_shell_type' from source: unknown 25201 1726882698.84384: variable 'ansible_shell_executable' from source: unknown 25201 1726882698.84390: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882698.84396: variable 'ansible_pipelining' from source: unknown 25201 1726882698.84403: variable 'ansible_timeout' from source: unknown 25201 1726882698.84410: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882698.84558: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 25201 1726882698.84585: variable 'omit' from source: magic vars 25201 
1726882698.84594: starting attempt loop 25201 1726882698.84601: running the handler 25201 1726882698.84623: _low_level_execute_command(): starting 25201 1726882698.84636: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 25201 1726882698.85467: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25201 1726882698.85484: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882698.85499: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882698.85516: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882698.85570: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882698.85584: stderr chunk (state=3): >>>debug2: match not found <<< 25201 1726882698.85599: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882698.85619: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25201 1726882698.85631: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 25201 1726882698.85642: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 25201 1726882698.85662: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882698.85684: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882698.85701: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882698.85713: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882698.85725: stderr chunk (state=3): >>>debug2: match found <<< 25201 1726882698.85739: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882698.85824: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882698.85840: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882698.85853: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882698.86002: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882698.87684: stdout chunk (state=3): >>>/root <<< 25201 1726882698.87875: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882698.87878: stdout chunk (state=3): >>><<< 25201 1726882698.87881: stderr chunk (state=3): >>><<< 25201 1726882698.88000: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25201 1726882698.88004: _low_level_execute_command(): starting 25201 1726882698.88007: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882698.8790197-26058-267173638928204 `" && echo ansible-tmp-1726882698.8790197-26058-267173638928204="` echo /root/.ansible/tmp/ansible-tmp-1726882698.8790197-26058-267173638928204 `" ) && sleep 0' 25201 1726882698.90202: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882698.90206: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882698.90842: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 25201 1726882698.90845: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882698.90848: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882698.91280: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882698.93105: stdout chunk (state=3): >>>ansible-tmp-1726882698.8790197-26058-267173638928204=/root/.ansible/tmp/ansible-tmp-1726882698.8790197-26058-267173638928204 <<< 25201 1726882698.93211: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882698.93284: stderr chunk (state=3): >>><<< 25201 1726882698.93287: stdout chunk (state=3): >>><<< 25201 1726882698.93608: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882698.8790197-26058-267173638928204=/root/.ansible/tmp/ansible-tmp-1726882698.8790197-26058-267173638928204 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25201 1726882698.93612: variable 'ansible_module_compression' from source: unknown 25201 1726882698.93614: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-25201fmfeipqk/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 25201 1726882698.93617: variable 'ansible_facts' from source: unknown 25201 1726882698.93619: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882698.8790197-26058-267173638928204/AnsiballZ_command.py 25201 1726882698.94117: Sending initial data 25201 1726882698.94120: Sent initial data (156 bytes) 25201 1726882698.96008: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25201 1726882698.96137: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882698.96153: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882698.96179: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882698.96223: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882698.96351: stderr chunk (state=3): >>>debug2: match not found <<< 25201 1726882698.96370: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882698.96389: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25201 1726882698.96401: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 25201 1726882698.96411: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 25201 1726882698.96422: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882698.96434: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882698.96452: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882698.96468: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882698.96480: stderr chunk (state=3): >>>debug2: match found <<< 25201 1726882698.96492: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882698.96577: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882698.96594: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882698.96608: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882698.96792: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882698.98579: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: 
Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 25201 1726882698.98674: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 25201 1726882698.98772: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-25201fmfeipqk/tmpgmxxgaxw /root/.ansible/tmp/ansible-tmp-1726882698.8790197-26058-267173638928204/AnsiballZ_command.py <<< 25201 1726882698.98868: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 25201 1726882699.00384: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882699.00549: stderr chunk (state=3): >>><<< 25201 1726882699.00552: stdout chunk (state=3): >>><<< 25201 1726882699.00555: done transferring module to remote 25201 1726882699.00557: _low_level_execute_command(): starting 25201 1726882699.00559: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882698.8790197-26058-267173638928204/ /root/.ansible/tmp/ansible-tmp-1726882698.8790197-26058-267173638928204/AnsiballZ_command.py && sleep 0' 25201 1726882699.02225: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882699.02228: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882699.02246: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882699.02253: stderr chunk (state=3): >>>debug2: match not found <<< 25201 1726882699.02269: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882699.02280: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25201 1726882699.02287: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 25201 1726882699.02293: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 25201 1726882699.02300: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882699.02308: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882699.02319: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882699.02326: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882699.02332: stderr chunk (state=3): >>>debug2: match found <<< 25201 1726882699.02350: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882699.02424: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882699.02439: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882699.02449: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882699.02577: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882699.04399: stderr chunk (state=3): >>>debug2: Received exit status from master 0 
<<< 25201 1726882699.04403: stdout chunk (state=3): >>><<< 25201 1726882699.04410: stderr chunk (state=3): >>><<< 25201 1726882699.04426: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25201 1726882699.04429: _low_level_execute_command(): starting 25201 1726882699.04435: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882698.8790197-26058-267173638928204/AnsiballZ_command.py && sleep 0' 25201 1726882699.06150: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882699.06155: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882699.06271: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882699.06288: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration <<< 25201 1726882699.06294: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882699.06300: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882699.06305: stderr chunk (state=3): >>>debug2: match found <<< 25201 1726882699.06317: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882699.06418: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882699.06686: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882699.06903: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882699.20341: stdout chunk (state=3): >>> {"changed": true, "stdout": "28: veth0@if27: mtu 1500 qdisc noqueue state UP group default qlen 1000\n link/ether 16:6a:32:5d:00:24 brd ff:ff:ff:ff:ff:ff link-netns ns1\n inet6 2001:db8::2/32 scope global 
noprefixroute \n valid_lft forever preferred_lft forever\n inet6 2001:db8::3/32 scope global noprefixroute \n valid_lft forever preferred_lft forever\n inet6 2001:db8::4/32 scope global noprefixroute \n valid_lft forever preferred_lft forever\n inet6 fe80::2645:824:bc1:2589/64 scope link noprefixroute \n valid_lft forever preferred_lft forever", "stderr": "", "rc": 0, "cmd": ["ip", "addr", "show", "veth0"], "start": "2024-09-20 21:38:19.197381", "end": "2024-09-20 21:38:19.201272", "delta": "0:00:00.003891", "msg": "", "invocation": {"module_args": {"_raw_params": "ip addr show veth0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 25201 1726882699.21648: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. <<< 25201 1726882699.21652: stdout chunk (state=3): >>><<< 25201 1726882699.21655: stderr chunk (state=3): >>><<< 25201 1726882699.21813: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "28: veth0@if27: mtu 1500 qdisc noqueue state UP group default qlen 1000\n link/ether 16:6a:32:5d:00:24 brd ff:ff:ff:ff:ff:ff link-netns ns1\n inet6 2001:db8::2/32 scope global noprefixroute \n valid_lft forever preferred_lft forever\n inet6 2001:db8::3/32 scope global noprefixroute \n valid_lft forever preferred_lft forever\n inet6 2001:db8::4/32 scope global noprefixroute \n valid_lft forever preferred_lft forever\n inet6 fe80::2645:824:bc1:2589/64 scope link noprefixroute \n valid_lft forever preferred_lft forever", "stderr": "", "rc": 0, "cmd": ["ip", "addr", "show", "veth0"], "start": "2024-09-20 21:38:19.197381", "end": "2024-09-20 21:38:19.201272", "delta": "0:00:00.003891", "msg": "", "invocation": {"module_args": {"_raw_params": "ip addr show veth0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
25201 1726882699.21823: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip addr show veth0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882698.8790197-26058-267173638928204/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 25201 1726882699.21826: _low_level_execute_command(): starting 25201 1726882699.21828: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882698.8790197-26058-267173638928204/ > /dev/null 2>&1 && sleep 0' 25201 1726882699.23256: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25201 1726882699.23344: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882699.23360: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882699.23384: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882699.23426: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882699.23443: stderr chunk (state=3): >>>debug2: match not found <<< 25201 1726882699.23458: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882699.23483: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25201 1726882699.23562: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 25201 1726882699.23580: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 25201 1726882699.23594: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882699.23608: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882699.23624: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882699.23637: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882699.23650: stderr chunk (state=3): >>>debug2: match found <<< 25201 1726882699.23673: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882699.23748: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882699.23895: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882699.23910: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882699.24112: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882699.26043: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882699.26047: stdout chunk (state=3): >>><<< 25201 1726882699.26050: stderr chunk (state=3): >>><<< 25201 1726882699.26375: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, 
OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25201 1726882699.26378: handler run complete 25201 1726882699.26381: Evaluated conditional (False): False 25201 1726882699.26383: attempt loop complete, returning result 25201 1726882699.26385: _execute() done 25201 1726882699.26387: dumping result to json 25201 1726882699.26389: done dumping result, returning 25201 1726882699.26392: done running TaskExecutor() for managed_node2/TASK: Get ip address information [0e448fcc-3ce9-313b-197e-00000000005e] 25201 1726882699.26394: sending task result for task 0e448fcc-3ce9-313b-197e-00000000005e 25201 1726882699.26472: done sending task result for task 0e448fcc-3ce9-313b-197e-00000000005e 25201 1726882699.26475: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": [ "ip", "addr", "show", "veth0" ], "delta": "0:00:00.003891", "end": "2024-09-20 21:38:19.201272", "rc": 0, "start": "2024-09-20 21:38:19.197381" } STDOUT: 28: veth0@if27: mtu 1500 qdisc noqueue state UP group default qlen 1000 link/ether 16:6a:32:5d:00:24 brd ff:ff:ff:ff:ff:ff link-netns ns1 inet6 2001:db8::2/32 scope global noprefixroute valid_lft forever preferred_lft forever inet6 2001:db8::3/32 scope global noprefixroute valid_lft forever preferred_lft forever inet6 2001:db8::4/32 scope global noprefixroute valid_lft forever preferred_lft forever inet6 fe80::2645:824:bc1:2589/64 scope link noprefixroute valid_lft forever preferred_lft forever 25201 1726882699.26557: no more pending results, returning what we have 25201 1726882699.26561: results queue empty 25201 1726882699.26562: checking for any_errors_fatal 25201 1726882699.26574: done checking for any_errors_fatal 25201 1726882699.26575: checking for max_fail_percentage 25201 1726882699.26577: done checking for max_fail_percentage 25201 1726882699.26577: checking to see if all hosts have failed and the running result is not ok 25201 1726882699.26578: done checking to see if all hosts have failed 25201 1726882699.26579: getting the remaining hosts for this loop 25201 1726882699.26581: done getting the remaining hosts for this loop 25201 1726882699.26584: getting the next task for host managed_node2 25201 1726882699.26591: done getting next task for host managed_node2 25201 1726882699.26594: ^ task is: TASK: Show ip_addr 25201 1726882699.26595: ^ state is: HOST STATE: block=3, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, 
tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25201 1726882699.26598: getting variables 25201 1726882699.26600: in VariableManager get_vars() 25201 1726882699.26643: Calling all_inventory to load vars for managed_node2 25201 1726882699.26646: Calling groups_inventory to load vars for managed_node2 25201 1726882699.26649: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882699.26661: Calling all_plugins_play to load vars for managed_node2 25201 1726882699.26668: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882699.26672: Calling groups_plugins_play to load vars for managed_node2 25201 1726882699.29101: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882699.32705: done with get_vars() 25201 1726882699.32730: done getting variables 25201 1726882699.32794: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show ip_addr] ************************************************************ task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:57 Friday 20 September 2024 21:38:19 -0400 (0:00:00.514) 0:00:20.502 ****** 25201 1726882699.32823: entering _queue_task() for managed_node2/debug 25201 1726882699.33150: worker is 1 (out of 1 available) 25201 1726882699.33167: exiting _queue_task() for managed_node2/debug 25201 1726882699.33180: done queuing things up, now waiting for results queue to drain 25201 1726882699.33182: waiting for pending results... 
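Unlike the asserts, the "Get ip address information" task (tests_ipv6.yml:53) does reach the remote host: the executor reuses the multiplexed SSH connection, creates a temporary directory under /root/.ansible/tmp, transfers the cached AnsiballZ_command.py payload over SFTP, runs it with /usr/bin/python3.9, and removes the temporary directory afterwards. The recorded module arguments ("ip addr show veth0", _uses_shell false), the registered output consumed by the following "Show ip_addr" task, and the final ok (not changed) status together with "Evaluated conditional (False): False" are consistent with a task roughly like the sketch below; the exact YAML is not shown in the log:

  # Hedged reconstruction of tests_ipv6.yml:53 -- the command is taken from the
  # result; the register name and changed_when are inferred, not shown verbatim.
  - name: Get ip address information
    command: ip addr show {{ interface }}
    register: ip_addr
    changed_when: false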
25201 1726882699.33467: running TaskExecutor() for managed_node2/TASK: Show ip_addr 25201 1726882699.33571: in run() - task 0e448fcc-3ce9-313b-197e-00000000005f 25201 1726882699.33591: variable 'ansible_search_path' from source: unknown 25201 1726882699.33637: calling self._execute() 25201 1726882699.33736: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882699.33748: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882699.33768: variable 'omit' from source: magic vars 25201 1726882699.34140: variable 'ansible_distribution_major_version' from source: facts 25201 1726882699.34161: Evaluated conditional (ansible_distribution_major_version != '6'): True 25201 1726882699.34186: variable 'omit' from source: magic vars 25201 1726882699.34212: variable 'omit' from source: magic vars 25201 1726882699.34261: variable 'omit' from source: magic vars 25201 1726882699.34320: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25201 1726882699.34358: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25201 1726882699.34389: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25201 1726882699.34415: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25201 1726882699.34433: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25201 1726882699.34470: variable 'inventory_hostname' from source: host vars for 'managed_node2' 25201 1726882699.34480: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882699.34488: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882699.34602: Set connection var ansible_shell_executable to /bin/sh 25201 1726882699.34619: Set connection var ansible_pipelining to False 25201 1726882699.34629: Set connection var ansible_connection to ssh 25201 1726882699.34638: Set connection var ansible_module_compression to ZIP_DEFLATED 25201 1726882699.34645: Set connection var ansible_shell_type to sh 25201 1726882699.34656: Set connection var ansible_timeout to 10 25201 1726882699.34688: variable 'ansible_shell_executable' from source: unknown 25201 1726882699.34696: variable 'ansible_connection' from source: unknown 25201 1726882699.34703: variable 'ansible_module_compression' from source: unknown 25201 1726882699.34710: variable 'ansible_shell_type' from source: unknown 25201 1726882699.34718: variable 'ansible_shell_executable' from source: unknown 25201 1726882699.34727: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882699.34735: variable 'ansible_pipelining' from source: unknown 25201 1726882699.34741: variable 'ansible_timeout' from source: unknown 25201 1726882699.34749: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882699.35020: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 25201 1726882699.35037: variable 'omit' from source: magic vars 25201 1726882699.35049: starting attempt loop 25201 1726882699.35058: running the handler 25201 
1726882699.35308: variable 'ip_addr' from source: set_fact 25201 1726882699.35394: handler run complete 25201 1726882699.35417: attempt loop complete, returning result 25201 1726882699.35497: _execute() done 25201 1726882699.35504: dumping result to json 25201 1726882699.35512: done dumping result, returning 25201 1726882699.35523: done running TaskExecutor() for managed_node2/TASK: Show ip_addr [0e448fcc-3ce9-313b-197e-00000000005f] 25201 1726882699.35534: sending task result for task 0e448fcc-3ce9-313b-197e-00000000005f ok: [managed_node2] => { "ip_addr.stdout": "28: veth0@if27: mtu 1500 qdisc noqueue state UP group default qlen 1000\n link/ether 16:6a:32:5d:00:24 brd ff:ff:ff:ff:ff:ff link-netns ns1\n inet6 2001:db8::2/32 scope global noprefixroute \n valid_lft forever preferred_lft forever\n inet6 2001:db8::3/32 scope global noprefixroute \n valid_lft forever preferred_lft forever\n inet6 2001:db8::4/32 scope global noprefixroute \n valid_lft forever preferred_lft forever\n inet6 fe80::2645:824:bc1:2589/64 scope link noprefixroute \n valid_lft forever preferred_lft forever" } 25201 1726882699.35693: no more pending results, returning what we have 25201 1726882699.35696: results queue empty 25201 1726882699.35698: checking for any_errors_fatal 25201 1726882699.35709: done checking for any_errors_fatal 25201 1726882699.35710: checking for max_fail_percentage 25201 1726882699.35712: done checking for max_fail_percentage 25201 1726882699.35713: checking to see if all hosts have failed and the running result is not ok 25201 1726882699.35714: done checking to see if all hosts have failed 25201 1726882699.35715: getting the remaining hosts for this loop 25201 1726882699.35717: done getting the remaining hosts for this loop 25201 1726882699.35721: getting the next task for host managed_node2 25201 1726882699.35729: done getting next task for host managed_node2 25201 1726882699.35732: ^ task is: TASK: Assert ipv6 addresses are correctly set 25201 1726882699.35734: ^ state is: HOST STATE: block=3, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25201 1726882699.35737: getting variables 25201 1726882699.35739: in VariableManager get_vars() 25201 1726882699.35788: Calling all_inventory to load vars for managed_node2 25201 1726882699.35792: Calling groups_inventory to load vars for managed_node2 25201 1726882699.35794: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882699.35806: Calling all_plugins_play to load vars for managed_node2 25201 1726882699.35810: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882699.35813: Calling groups_plugins_play to load vars for managed_node2 25201 1726882699.37253: done sending task result for task 0e448fcc-3ce9-313b-197e-00000000005f 25201 1726882699.37257: WORKER PROCESS EXITING 25201 1726882699.37950: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882699.39778: done with get_vars() 25201 1726882699.39800: done getting variables 25201 1726882699.39857: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Assert ipv6 addresses are correctly set] ********************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:60 Friday 20 September 2024 21:38:19 -0400 (0:00:00.070) 0:00:20.573 ****** 25201 1726882699.39893: entering _queue_task() for managed_node2/assert 25201 1726882699.40171: worker is 1 (out of 1 available) 25201 1726882699.40182: exiting _queue_task() for managed_node2/assert 25201 1726882699.40195: done queuing things up, now waiting for results queue to drain 25201 1726882699.40196: waiting for pending results... 
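The worker output that follows prints the registered ip_addr fact and then evaluates three conditionals of the form 'inet6 2001:db8::2/32' in ip_addr.stdout. From those evaluations, the pair of tasks being traced here plausibly looks like the sketch below; this is a reconstruction inferred from the log, not the actual contents of tests_ipv6.yml, so the exact layout is an assumption.

    # Hypothetical reconstruction of the tasks traced in this run
    - name: Show ip_addr
      debug:
        var: ip_addr.stdout

    - name: Assert ipv6 addresses are correctly set
      assert:
        that:
          - "'inet6 2001:db8::2/32' in ip_addr.stdout"
          - "'inet6 2001:db8::3/32' in ip_addr.stdout"
          - "'inet6 2001:db8::4/32' in ip_addr.stdout"

Each entry under that: corresponds to one "Evaluated conditional (...): True" line emitted by the assert action plugin in the trace below.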
25201 1726882699.40956: running TaskExecutor() for managed_node2/TASK: Assert ipv6 addresses are correctly set 25201 1726882699.41053: in run() - task 0e448fcc-3ce9-313b-197e-000000000060 25201 1726882699.41081: variable 'ansible_search_path' from source: unknown 25201 1726882699.41119: calling self._execute() 25201 1726882699.41215: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882699.41226: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882699.41239: variable 'omit' from source: magic vars 25201 1726882699.41627: variable 'ansible_distribution_major_version' from source: facts 25201 1726882699.41646: Evaluated conditional (ansible_distribution_major_version != '6'): True 25201 1726882699.41658: variable 'omit' from source: magic vars 25201 1726882699.41687: variable 'omit' from source: magic vars 25201 1726882699.41732: variable 'omit' from source: magic vars 25201 1726882699.41781: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25201 1726882699.41822: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25201 1726882699.41851: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25201 1726882699.41878: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25201 1726882699.41895: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25201 1726882699.41929: variable 'inventory_hostname' from source: host vars for 'managed_node2' 25201 1726882699.41941: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882699.41949: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882699.42060: Set connection var ansible_shell_executable to /bin/sh 25201 1726882699.42074: Set connection var ansible_pipelining to False 25201 1726882699.42084: Set connection var ansible_connection to ssh 25201 1726882699.42093: Set connection var ansible_module_compression to ZIP_DEFLATED 25201 1726882699.42099: Set connection var ansible_shell_type to sh 25201 1726882699.42110: Set connection var ansible_timeout to 10 25201 1726882699.42134: variable 'ansible_shell_executable' from source: unknown 25201 1726882699.42141: variable 'ansible_connection' from source: unknown 25201 1726882699.42148: variable 'ansible_module_compression' from source: unknown 25201 1726882699.42158: variable 'ansible_shell_type' from source: unknown 25201 1726882699.42168: variable 'ansible_shell_executable' from source: unknown 25201 1726882699.42175: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882699.42182: variable 'ansible_pipelining' from source: unknown 25201 1726882699.42187: variable 'ansible_timeout' from source: unknown 25201 1726882699.42193: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882699.42331: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 25201 1726882699.42347: variable 'omit' from source: magic vars 25201 1726882699.42356: starting attempt loop 25201 1726882699.42362: 
running the handler 25201 1726882699.42511: variable 'ip_addr' from source: set_fact 25201 1726882699.42529: Evaluated conditional ('inet6 2001:db8::2/32' in ip_addr.stdout): True 25201 1726882699.42655: variable 'ip_addr' from source: set_fact 25201 1726882699.42675: Evaluated conditional ('inet6 2001:db8::3/32' in ip_addr.stdout): True 25201 1726882699.42799: variable 'ip_addr' from source: set_fact 25201 1726882699.42817: Evaluated conditional ('inet6 2001:db8::4/32' in ip_addr.stdout): True 25201 1726882699.42828: handler run complete 25201 1726882699.42846: attempt loop complete, returning result 25201 1726882699.42851: _execute() done 25201 1726882699.42857: dumping result to json 25201 1726882699.42866: done dumping result, returning 25201 1726882699.42878: done running TaskExecutor() for managed_node2/TASK: Assert ipv6 addresses are correctly set [0e448fcc-3ce9-313b-197e-000000000060] 25201 1726882699.42888: sending task result for task 0e448fcc-3ce9-313b-197e-000000000060 25201 1726882699.42991: done sending task result for task 0e448fcc-3ce9-313b-197e-000000000060 25201 1726882699.42997: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 25201 1726882699.43073: no more pending results, returning what we have 25201 1726882699.43077: results queue empty 25201 1726882699.43078: checking for any_errors_fatal 25201 1726882699.43089: done checking for any_errors_fatal 25201 1726882699.43090: checking for max_fail_percentage 25201 1726882699.43092: done checking for max_fail_percentage 25201 1726882699.43093: checking to see if all hosts have failed and the running result is not ok 25201 1726882699.43094: done checking to see if all hosts have failed 25201 1726882699.43095: getting the remaining hosts for this loop 25201 1726882699.43096: done getting the remaining hosts for this loop 25201 1726882699.43101: getting the next task for host managed_node2 25201 1726882699.43107: done getting next task for host managed_node2 25201 1726882699.43109: ^ task is: TASK: Get ipv6 routes 25201 1726882699.43111: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25201 1726882699.43114: getting variables 25201 1726882699.43116: in VariableManager get_vars() 25201 1726882699.43161: Calling all_inventory to load vars for managed_node2 25201 1726882699.43168: Calling groups_inventory to load vars for managed_node2 25201 1726882699.43174: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882699.43186: Calling all_plugins_play to load vars for managed_node2 25201 1726882699.43190: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882699.43193: Calling groups_plugins_play to load vars for managed_node2 25201 1726882699.45018: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882699.46903: done with get_vars() 25201 1726882699.46925: done getting variables 25201 1726882699.47031: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Get ipv6 routes] ********************************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:69 Friday 20 September 2024 21:38:19 -0400 (0:00:00.071) 0:00:20.645 ****** 25201 1726882699.47061: entering _queue_task() for managed_node2/command 25201 1726882699.47500: worker is 1 (out of 1 available) 25201 1726882699.47513: exiting _queue_task() for managed_node2/command 25201 1726882699.47531: done queuing things up, now waiting for results queue to drain 25201 1726882699.47533: waiting for pending results... 25201 1726882699.47852: running TaskExecutor() for managed_node2/TASK: Get ipv6 routes 25201 1726882699.47958: in run() - task 0e448fcc-3ce9-313b-197e-000000000061 25201 1726882699.47991: variable 'ansible_search_path' from source: unknown 25201 1726882699.48032: calling self._execute() 25201 1726882699.48171: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882699.48195: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882699.48228: variable 'omit' from source: magic vars 25201 1726882699.48700: variable 'ansible_distribution_major_version' from source: facts 25201 1726882699.48716: Evaluated conditional (ansible_distribution_major_version != '6'): True 25201 1726882699.48726: variable 'omit' from source: magic vars 25201 1726882699.48758: variable 'omit' from source: magic vars 25201 1726882699.48797: variable 'omit' from source: magic vars 25201 1726882699.48842: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25201 1726882699.48901: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25201 1726882699.48926: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25201 1726882699.48949: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25201 1726882699.48968: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25201 1726882699.49009: variable 'inventory_hostname' from source: host vars for 'managed_node2' 25201 1726882699.49019: variable 'ansible_host' from source: host vars for 
'managed_node2' 25201 1726882699.49026: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882699.49253: Set connection var ansible_shell_executable to /bin/sh 25201 1726882699.49265: Set connection var ansible_pipelining to False 25201 1726882699.49275: Set connection var ansible_connection to ssh 25201 1726882699.49284: Set connection var ansible_module_compression to ZIP_DEFLATED 25201 1726882699.49290: Set connection var ansible_shell_type to sh 25201 1726882699.49308: Set connection var ansible_timeout to 10 25201 1726882699.49375: variable 'ansible_shell_executable' from source: unknown 25201 1726882699.49384: variable 'ansible_connection' from source: unknown 25201 1726882699.49392: variable 'ansible_module_compression' from source: unknown 25201 1726882699.49399: variable 'ansible_shell_type' from source: unknown 25201 1726882699.49408: variable 'ansible_shell_executable' from source: unknown 25201 1726882699.49420: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882699.49433: variable 'ansible_pipelining' from source: unknown 25201 1726882699.49440: variable 'ansible_timeout' from source: unknown 25201 1726882699.49447: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882699.49651: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 25201 1726882699.49670: variable 'omit' from source: magic vars 25201 1726882699.49680: starting attempt loop 25201 1726882699.49687: running the handler 25201 1726882699.49709: _low_level_execute_command(): starting 25201 1726882699.49722: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 25201 1726882699.50562: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882699.50568: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882699.50597: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882699.50601: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882699.50603: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882699.50650: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882699.50661: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882699.50780: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882699.52431: stdout chunk (state=3): >>>/root <<< 25201 1726882699.52605: 
stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882699.52608: stdout chunk (state=3): >>><<< 25201 1726882699.52611: stderr chunk (state=3): >>><<< 25201 1726882699.52715: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25201 1726882699.52718: _low_level_execute_command(): starting 25201 1726882699.52723: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882699.5263598-26090-23079196040980 `" && echo ansible-tmp-1726882699.5263598-26090-23079196040980="` echo /root/.ansible/tmp/ansible-tmp-1726882699.5263598-26090-23079196040980 `" ) && sleep 0' 25201 1726882699.53353: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882699.53357: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882699.53399: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882699.53402: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882699.53409: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882699.53449: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882699.53453: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882699.53567: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882699.55443: stdout chunk (state=3): >>>ansible-tmp-1726882699.5263598-26090-23079196040980=/root/.ansible/tmp/ansible-tmp-1726882699.5263598-26090-23079196040980 
<<< 25201 1726882699.55553: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882699.55624: stderr chunk (state=3): >>><<< 25201 1726882699.55637: stdout chunk (state=3): >>><<< 25201 1726882699.55883: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882699.5263598-26090-23079196040980=/root/.ansible/tmp/ansible-tmp-1726882699.5263598-26090-23079196040980 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25201 1726882699.55886: variable 'ansible_module_compression' from source: unknown 25201 1726882699.55888: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-25201fmfeipqk/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 25201 1726882699.55890: variable 'ansible_facts' from source: unknown 25201 1726882699.55892: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882699.5263598-26090-23079196040980/AnsiballZ_command.py 25201 1726882699.56011: Sending initial data 25201 1726882699.56014: Sent initial data (155 bytes) 25201 1726882699.57047: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882699.57050: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882699.57094: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882699.57097: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882699.57099: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882699.57146: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882699.57153: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882699.57167: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 25201 1726882699.57275: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882699.59011: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 <<< 25201 1726882699.59015: stderr chunk (state=3): >>>debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 25201 1726882699.59110: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 25201 1726882699.59278: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-25201fmfeipqk/tmpq4wo7ulg /root/.ansible/tmp/ansible-tmp-1726882699.5263598-26090-23079196040980/AnsiballZ_command.py <<< 25201 1726882699.59383: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 25201 1726882699.60559: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882699.60680: stderr chunk (state=3): >>><<< 25201 1726882699.60683: stdout chunk (state=3): >>><<< 25201 1726882699.60703: done transferring module to remote 25201 1726882699.60712: _low_level_execute_command(): starting 25201 1726882699.60718: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882699.5263598-26090-23079196040980/ /root/.ansible/tmp/ansible-tmp-1726882699.5263598-26090-23079196040980/AnsiballZ_command.py && sleep 0' 25201 1726882699.61392: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882699.61396: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882699.61448: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882699.61451: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882699.61453: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882699.61455: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<< 25201 1726882699.61457: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882699.61513: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882699.61516: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882699.61618: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 25201 1726882699.63369: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882699.63438: stderr chunk (state=3): >>><<< 25201 1726882699.63445: stdout chunk (state=3): >>><<< 25201 1726882699.63460: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25201 1726882699.63467: _low_level_execute_command(): starting 25201 1726882699.63470: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882699.5263598-26090-23079196040980/AnsiballZ_command.py && sleep 0' 25201 1726882699.64070: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882699.64073: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882699.64106: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882699.64115: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882699.64124: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882699.64135: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882699.64141: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882699.64194: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882699.64214: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882699.64219: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882699.64321: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882699.77744: stdout chunk (state=3): >>> {"changed": true, "stdout": "::1 dev lo proto kernel metric 
256 pref medium\n2001:db8::/32 dev veth0 proto kernel metric 101 pref medium\nfe80::/64 dev eth0 proto kernel metric 256 pref medium\nfe80::/64 dev veth0 proto kernel metric 1024 pref medium\ndefault via 2001:db8::1 dev veth0 proto static metric 101 pref medium", "stderr": "", "rc": 0, "cmd": ["ip", "-6", "route"], "start": "2024-09-20 21:38:19.771860", "end": "2024-09-20 21:38:19.775234", "delta": "0:00:00.003374", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -6 route", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 25201 1726882699.78951: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. <<< 25201 1726882699.78955: stderr chunk (state=3): >>><<< 25201 1726882699.78957: stdout chunk (state=3): >>><<< 25201 1726882699.78984: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "::1 dev lo proto kernel metric 256 pref medium\n2001:db8::/32 dev veth0 proto kernel metric 101 pref medium\nfe80::/64 dev eth0 proto kernel metric 256 pref medium\nfe80::/64 dev veth0 proto kernel metric 1024 pref medium\ndefault via 2001:db8::1 dev veth0 proto static metric 101 pref medium", "stderr": "", "rc": 0, "cmd": ["ip", "-6", "route"], "start": "2024-09-20 21:38:19.771860", "end": "2024-09-20 21:38:19.775234", "delta": "0:00:00.003374", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -6 route", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
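The module result embedded in the stdout chunk above reports cmd ["ip", "-6", "route"] with _raw_params "ip -6 route", and the callback result printed further down reports changed: false together with an "Evaluated conditional (False)" line, which points at a changed_when override. A minimal sketch of what the task at tests_ipv6.yml:69 likely looks like, assuming the registered variable name ipv6_route that the later tasks reference:

    # Hypothetical sketch of the command task traced above
    - name: Get ipv6 routes
      command: ip -6 route
      register: ipv6_route
      changed_when: false

The registered result is what the following "Show ipv6_route" debug task prints and the subsequent assert inspects.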
25201 1726882699.79022: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip -6 route', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882699.5263598-26090-23079196040980/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 25201 1726882699.79028: _low_level_execute_command(): starting 25201 1726882699.79033: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882699.5263598-26090-23079196040980/ > /dev/null 2>&1 && sleep 0' 25201 1726882699.79694: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25201 1726882699.79703: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882699.79714: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882699.79727: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882699.79768: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882699.79778: stderr chunk (state=3): >>>debug2: match not found <<< 25201 1726882699.79800: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882699.79814: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25201 1726882699.79828: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 25201 1726882699.79835: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 25201 1726882699.79843: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882699.79852: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882699.79865: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882699.79876: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882699.79883: stderr chunk (state=3): >>>debug2: match found <<< 25201 1726882699.79893: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882699.79962: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882699.79982: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882699.79985: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882699.80121: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882699.81909: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882699.82081: stderr chunk (state=3): >>><<< 25201 1726882699.82085: stdout chunk (state=3): >>><<< 25201 1726882699.82100: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 
4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25201 1726882699.82106: handler run complete 25201 1726882699.82129: Evaluated conditional (False): False 25201 1726882699.82140: attempt loop complete, returning result 25201 1726882699.82143: _execute() done 25201 1726882699.82145: dumping result to json 25201 1726882699.82150: done dumping result, returning 25201 1726882699.82159: done running TaskExecutor() for managed_node2/TASK: Get ipv6 routes [0e448fcc-3ce9-313b-197e-000000000061] 25201 1726882699.82166: sending task result for task 0e448fcc-3ce9-313b-197e-000000000061 25201 1726882699.82279: done sending task result for task 0e448fcc-3ce9-313b-197e-000000000061 25201 1726882699.82282: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": [ "ip", "-6", "route" ], "delta": "0:00:00.003374", "end": "2024-09-20 21:38:19.775234", "rc": 0, "start": "2024-09-20 21:38:19.771860" } STDOUT: ::1 dev lo proto kernel metric 256 pref medium 2001:db8::/32 dev veth0 proto kernel metric 101 pref medium fe80::/64 dev eth0 proto kernel metric 256 pref medium fe80::/64 dev veth0 proto kernel metric 1024 pref medium default via 2001:db8::1 dev veth0 proto static metric 101 pref medium 25201 1726882699.82353: no more pending results, returning what we have 25201 1726882699.82356: results queue empty 25201 1726882699.82358: checking for any_errors_fatal 25201 1726882699.82370: done checking for any_errors_fatal 25201 1726882699.82371: checking for max_fail_percentage 25201 1726882699.82373: done checking for max_fail_percentage 25201 1726882699.82376: checking to see if all hosts have failed and the running result is not ok 25201 1726882699.82376: done checking to see if all hosts have failed 25201 1726882699.82377: getting the remaining hosts for this loop 25201 1726882699.82379: done getting the remaining hosts for this loop 25201 1726882699.82383: getting the next task for host managed_node2 25201 1726882699.82388: done getting next task for host managed_node2 25201 1726882699.82390: ^ task is: TASK: Show ipv6_route 25201 1726882699.82392: ^ state is: HOST STATE: block=3, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25201 1726882699.82395: getting variables 25201 1726882699.82396: in VariableManager get_vars() 25201 1726882699.82431: Calling all_inventory to load vars for managed_node2 25201 1726882699.82434: Calling groups_inventory to load vars for managed_node2 25201 1726882699.82436: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882699.82445: Calling all_plugins_play to load vars for managed_node2 25201 1726882699.82448: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882699.82450: Calling groups_plugins_play to load vars for managed_node2 25201 1726882699.85100: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882699.89652: done with get_vars() 25201 1726882699.89680: done getting variables 25201 1726882699.89737: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show ipv6_route] ********************************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:73 Friday 20 September 2024 21:38:19 -0400 (0:00:00.427) 0:00:21.072 ****** 25201 1726882699.89769: entering _queue_task() for managed_node2/debug 25201 1726882699.90061: worker is 1 (out of 1 available) 25201 1726882699.90578: exiting _queue_task() for managed_node2/debug 25201 1726882699.90590: done queuing things up, now waiting for results queue to drain 25201 1726882699.90592: waiting for pending results... 25201 1726882699.90831: running TaskExecutor() for managed_node2/TASK: Show ipv6_route 25201 1726882699.90905: in run() - task 0e448fcc-3ce9-313b-197e-000000000062 25201 1726882699.90918: variable 'ansible_search_path' from source: unknown 25201 1726882699.90950: calling self._execute() 25201 1726882699.91040: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882699.91046: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882699.91056: variable 'omit' from source: magic vars 25201 1726882699.92224: variable 'ansible_distribution_major_version' from source: facts 25201 1726882699.92236: Evaluated conditional (ansible_distribution_major_version != '6'): True 25201 1726882699.92242: variable 'omit' from source: magic vars 25201 1726882699.92263: variable 'omit' from source: magic vars 25201 1726882699.92302: variable 'omit' from source: magic vars 25201 1726882699.92342: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25201 1726882699.92380: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25201 1726882699.92399: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25201 1726882699.92417: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25201 1726882699.92428: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25201 1726882699.92456: variable 'inventory_hostname' from source: host vars for 'managed_node2' 25201 1726882699.92460: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 
1726882699.92464: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882699.92567: Set connection var ansible_shell_executable to /bin/sh 25201 1726882699.92574: Set connection var ansible_pipelining to False 25201 1726882699.92585: Set connection var ansible_connection to ssh 25201 1726882699.92595: Set connection var ansible_module_compression to ZIP_DEFLATED 25201 1726882699.92776: Set connection var ansible_shell_type to sh 25201 1726882699.92789: Set connection var ansible_timeout to 10 25201 1726882699.92815: variable 'ansible_shell_executable' from source: unknown 25201 1726882699.92824: variable 'ansible_connection' from source: unknown 25201 1726882699.92832: variable 'ansible_module_compression' from source: unknown 25201 1726882699.92839: variable 'ansible_shell_type' from source: unknown 25201 1726882699.92846: variable 'ansible_shell_executable' from source: unknown 25201 1726882699.92853: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882699.92860: variable 'ansible_pipelining' from source: unknown 25201 1726882699.92870: variable 'ansible_timeout' from source: unknown 25201 1726882699.92877: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882699.93001: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 25201 1726882699.93091: variable 'omit' from source: magic vars 25201 1726882699.93102: starting attempt loop 25201 1726882699.93109: running the handler 25201 1726882699.93445: variable 'ipv6_route' from source: set_fact 25201 1726882699.93469: handler run complete 25201 1726882699.93493: attempt loop complete, returning result 25201 1726882699.93500: _execute() done 25201 1726882699.93506: dumping result to json 25201 1726882699.93513: done dumping result, returning 25201 1726882699.93524: done running TaskExecutor() for managed_node2/TASK: Show ipv6_route [0e448fcc-3ce9-313b-197e-000000000062] 25201 1726882699.93534: sending task result for task 0e448fcc-3ce9-313b-197e-000000000062 ok: [managed_node2] => { "ipv6_route.stdout": "::1 dev lo proto kernel metric 256 pref medium\n2001:db8::/32 dev veth0 proto kernel metric 101 pref medium\nfe80::/64 dev eth0 proto kernel metric 256 pref medium\nfe80::/64 dev veth0 proto kernel metric 1024 pref medium\ndefault via 2001:db8::1 dev veth0 proto static metric 101 pref medium" } 25201 1726882699.93714: no more pending results, returning what we have 25201 1726882699.93717: results queue empty 25201 1726882699.93718: checking for any_errors_fatal 25201 1726882699.93725: done checking for any_errors_fatal 25201 1726882699.93726: checking for max_fail_percentage 25201 1726882699.93727: done checking for max_fail_percentage 25201 1726882699.93728: checking to see if all hosts have failed and the running result is not ok 25201 1726882699.93729: done checking to see if all hosts have failed 25201 1726882699.93730: getting the remaining hosts for this loop 25201 1726882699.93731: done getting the remaining hosts for this loop 25201 1726882699.93735: getting the next task for host managed_node2 25201 1726882699.93741: done getting next task for host managed_node2 25201 1726882699.93744: ^ task is: TASK: Assert default ipv6 route is set 25201 1726882699.93745: ^ state is: HOST STATE: block=3, 
task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25201 1726882699.93749: getting variables 25201 1726882699.93750: in VariableManager get_vars() 25201 1726882699.93793: Calling all_inventory to load vars for managed_node2 25201 1726882699.93796: Calling groups_inventory to load vars for managed_node2 25201 1726882699.93798: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882699.93810: Calling all_plugins_play to load vars for managed_node2 25201 1726882699.93813: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882699.93817: Calling groups_plugins_play to load vars for managed_node2 25201 1726882699.94474: done sending task result for task 0e448fcc-3ce9-313b-197e-000000000062 25201 1726882699.94477: WORKER PROCESS EXITING 25201 1726882699.97835: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882699.99572: done with get_vars() 25201 1726882699.99594: done getting variables 25201 1726882699.99649: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Assert default ipv6 route is set] **************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:76 Friday 20 September 2024 21:38:19 -0400 (0:00:00.099) 0:00:21.171 ****** 25201 1726882699.99682: entering _queue_task() for managed_node2/assert 25201 1726882699.99949: worker is 1 (out of 1 available) 25201 1726882699.99962: exiting _queue_task() for managed_node2/assert 25201 1726882699.99976: done queuing things up, now waiting for results queue to drain 25201 1726882699.99977: waiting for pending results... 
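The worker run that follows resolves __test_str from task vars and interface from play vars, then evaluates __test_str in ipv6_route.stdout. Given the route table captured earlier (default via 2001:db8::1 dev veth0 proto static metric 101), the assert at tests_ipv6.yml:76 presumably resembles the sketch below; the expression assigned to __test_str is an assumption inferred from that route line, not quoted from the playbook.

    # Hypothetical sketch; the __test_str value is inferred from the logged route table
    - name: Assert default ipv6 route is set
      vars:
        __test_str: "default via 2001:db8::1 dev {{ interface }}"
      assert:
        that:
          - __test_str in ipv6_route.stdout

With interface resolving to veth0, this matches the default route shown in the Get ipv6 routes output, which is why the trace records the conditional as True and "All assertions passed".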
25201 1726882700.00240: running TaskExecutor() for managed_node2/TASK: Assert default ipv6 route is set 25201 1726882700.00332: in run() - task 0e448fcc-3ce9-313b-197e-000000000063 25201 1726882700.00351: variable 'ansible_search_path' from source: unknown 25201 1726882700.00396: calling self._execute() 25201 1726882700.00500: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882700.00511: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882700.00529: variable 'omit' from source: magic vars 25201 1726882700.00906: variable 'ansible_distribution_major_version' from source: facts 25201 1726882700.00922: Evaluated conditional (ansible_distribution_major_version != '6'): True 25201 1726882700.00932: variable 'omit' from source: magic vars 25201 1726882700.00960: variable 'omit' from source: magic vars 25201 1726882700.01001: variable 'omit' from source: magic vars 25201 1726882700.01042: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25201 1726882700.01091: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25201 1726882700.01113: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25201 1726882700.01133: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25201 1726882700.01148: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25201 1726882700.01185: variable 'inventory_hostname' from source: host vars for 'managed_node2' 25201 1726882700.01193: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882700.01200: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882700.01306: Set connection var ansible_shell_executable to /bin/sh 25201 1726882700.01316: Set connection var ansible_pipelining to False 25201 1726882700.01324: Set connection var ansible_connection to ssh 25201 1726882700.01333: Set connection var ansible_module_compression to ZIP_DEFLATED 25201 1726882700.01338: Set connection var ansible_shell_type to sh 25201 1726882700.01349: Set connection var ansible_timeout to 10 25201 1726882700.01377: variable 'ansible_shell_executable' from source: unknown 25201 1726882700.01384: variable 'ansible_connection' from source: unknown 25201 1726882700.01389: variable 'ansible_module_compression' from source: unknown 25201 1726882700.01399: variable 'ansible_shell_type' from source: unknown 25201 1726882700.01405: variable 'ansible_shell_executable' from source: unknown 25201 1726882700.01411: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882700.01418: variable 'ansible_pipelining' from source: unknown 25201 1726882700.01424: variable 'ansible_timeout' from source: unknown 25201 1726882700.01430: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882700.01570: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 25201 1726882700.01587: variable 'omit' from source: magic vars 25201 1726882700.01595: starting attempt loop 25201 1726882700.01601: running 
the handler 25201 1726882700.01752: variable '__test_str' from source: task vars 25201 1726882700.01832: variable 'interface' from source: play vars 25201 1726882700.01845: variable 'ipv6_route' from source: set_fact 25201 1726882700.01859: Evaluated conditional (__test_str in ipv6_route.stdout): True 25201 1726882700.01873: handler run complete 25201 1726882700.01892: attempt loop complete, returning result 25201 1726882700.01898: _execute() done 25201 1726882700.01903: dumping result to json 25201 1726882700.01909: done dumping result, returning 25201 1726882700.01919: done running TaskExecutor() for managed_node2/TASK: Assert default ipv6 route is set [0e448fcc-3ce9-313b-197e-000000000063] 25201 1726882700.01927: sending task result for task 0e448fcc-3ce9-313b-197e-000000000063 25201 1726882700.02028: done sending task result for task 0e448fcc-3ce9-313b-197e-000000000063 25201 1726882700.02038: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 25201 1726882700.02094: no more pending results, returning what we have 25201 1726882700.02098: results queue empty 25201 1726882700.02099: checking for any_errors_fatal 25201 1726882700.02105: done checking for any_errors_fatal 25201 1726882700.02106: checking for max_fail_percentage 25201 1726882700.02108: done checking for max_fail_percentage 25201 1726882700.02109: checking to see if all hosts have failed and the running result is not ok 25201 1726882700.02109: done checking to see if all hosts have failed 25201 1726882700.02110: getting the remaining hosts for this loop 25201 1726882700.02112: done getting the remaining hosts for this loop 25201 1726882700.02116: getting the next task for host managed_node2 25201 1726882700.02123: done getting next task for host managed_node2 25201 1726882700.02125: ^ task is: TASK: Ensure ping6 command is present 25201 1726882700.02127: ^ state is: HOST STATE: block=3, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25201 1726882700.02130: getting variables 25201 1726882700.02132: in VariableManager get_vars() 25201 1726882700.02177: Calling all_inventory to load vars for managed_node2 25201 1726882700.02180: Calling groups_inventory to load vars for managed_node2 25201 1726882700.02183: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882700.02193: Calling all_plugins_play to load vars for managed_node2 25201 1726882700.02197: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882700.02200: Calling groups_plugins_play to load vars for managed_node2 25201 1726882700.03796: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882700.07736: done with get_vars() 25201 1726882700.07757: done getting variables 25201 1726882700.07810: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Ensure ping6 command is present] ***************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:81 Friday 20 September 2024 21:38:20 -0400 (0:00:00.081) 0:00:21.253 ****** 25201 1726882700.07837: entering _queue_task() for managed_node2/package 25201 1726882700.08134: worker is 1 (out of 1 available) 25201 1726882700.08148: exiting _queue_task() for managed_node2/package 25201 1726882700.08162: done queuing things up, now waiting for results queue to drain 25201 1726882700.08167: waiting for pending results... 25201 1726882700.09089: running TaskExecutor() for managed_node2/TASK: Ensure ping6 command is present 25201 1726882700.09287: in run() - task 0e448fcc-3ce9-313b-197e-000000000064 25201 1726882700.09303: variable 'ansible_search_path' from source: unknown 25201 1726882700.09336: calling self._execute() 25201 1726882700.09562: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882700.09680: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882700.09712: variable 'omit' from source: magic vars 25201 1726882700.10836: variable 'ansible_distribution_major_version' from source: facts 25201 1726882700.10849: Evaluated conditional (ansible_distribution_major_version != '6'): True 25201 1726882700.10856: variable 'omit' from source: magic vars 25201 1726882700.10962: variable 'omit' from source: magic vars 25201 1726882700.12260: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 25201 1726882700.16844: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 25201 1726882700.17170: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 25201 1726882700.17229: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 25201 1726882700.17387: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 25201 1726882700.17423: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 25201 1726882700.17622: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25201 1726882700.17649: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25201 1726882700.17674: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25201 1726882700.17711: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25201 1726882700.17723: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25201 1726882700.17828: variable '__network_is_ostree' from source: set_fact 25201 1726882700.17832: variable 'omit' from source: magic vars 25201 1726882700.17874: variable 'omit' from source: magic vars 25201 1726882700.17899: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25201 1726882700.17924: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25201 1726882700.17939: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25201 1726882700.17961: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25201 1726882700.17973: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25201 1726882700.18002: variable 'inventory_hostname' from source: host vars for 'managed_node2' 25201 1726882700.18006: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882700.18009: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882700.18113: Set connection var ansible_shell_executable to /bin/sh 25201 1726882700.18116: Set connection var ansible_pipelining to False 25201 1726882700.18123: Set connection var ansible_connection to ssh 25201 1726882700.18128: Set connection var ansible_module_compression to ZIP_DEFLATED 25201 1726882700.18131: Set connection var ansible_shell_type to sh 25201 1726882700.18138: Set connection var ansible_timeout to 10 25201 1726882700.18170: variable 'ansible_shell_executable' from source: unknown 25201 1726882700.18173: variable 'ansible_connection' from source: unknown 25201 1726882700.18175: variable 'ansible_module_compression' from source: unknown 25201 1726882700.18177: variable 'ansible_shell_type' from source: unknown 25201 1726882700.18180: variable 'ansible_shell_executable' from source: unknown 25201 1726882700.18182: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882700.18185: variable 'ansible_pipelining' from source: unknown 25201 1726882700.18188: variable 'ansible_timeout' from source: unknown 25201 1726882700.18190: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882700.18294: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 25201 1726882700.18305: variable 'omit' from source: magic vars 25201 1726882700.18308: starting attempt loop 25201 1726882700.18311: running the handler 25201 1726882700.18317: variable 'ansible_facts' from source: unknown 25201 1726882700.18320: variable 'ansible_facts' from source: unknown 25201 1726882700.18353: _low_level_execute_command(): starting 25201 1726882700.18361: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 25201 1726882700.19137: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25201 1726882700.19159: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882700.19180: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882700.19193: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882700.19230: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882700.19237: stderr chunk (state=3): >>>debug2: match not found <<< 25201 1726882700.19246: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882700.19281: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25201 1726882700.19284: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 25201 1726882700.19296: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 25201 1726882700.19304: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882700.19314: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882700.19325: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882700.19332: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882700.19339: stderr chunk (state=3): >>>debug2: match found <<< 25201 1726882700.19351: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882700.19458: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882700.19775: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882700.19785: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882700.19788: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882700.21283: stdout chunk (state=3): >>>/root <<< 25201 1726882700.21411: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882700.21460: stderr chunk (state=3): >>><<< 25201 1726882700.21467: stdout chunk (state=3): >>><<< 25201 1726882700.21489: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25201 1726882700.21501: _low_level_execute_command(): starting 25201 1726882700.21506: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882700.2148938-26116-248710385906586 `" && echo ansible-tmp-1726882700.2148938-26116-248710385906586="` echo /root/.ansible/tmp/ansible-tmp-1726882700.2148938-26116-248710385906586 `" ) && sleep 0' 25201 1726882700.22087: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25201 1726882700.22095: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882700.22106: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882700.22119: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882700.22157: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882700.22174: stderr chunk (state=3): >>>debug2: match not found <<< 25201 1726882700.22183: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882700.22196: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25201 1726882700.22203: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 25201 1726882700.22210: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 25201 1726882700.22217: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882700.22226: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882700.22242: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882700.22250: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882700.22256: stderr chunk (state=3): >>>debug2: match found <<< 25201 1726882700.22269: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882700.22353: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882700.22369: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882700.22379: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882700.22517: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882700.24384: stdout chunk (state=3): 
>>>ansible-tmp-1726882700.2148938-26116-248710385906586=/root/.ansible/tmp/ansible-tmp-1726882700.2148938-26116-248710385906586 <<< 25201 1726882700.24511: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882700.24588: stderr chunk (state=3): >>><<< 25201 1726882700.24591: stdout chunk (state=3): >>><<< 25201 1726882700.24670: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882700.2148938-26116-248710385906586=/root/.ansible/tmp/ansible-tmp-1726882700.2148938-26116-248710385906586 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25201 1726882700.24675: variable 'ansible_module_compression' from source: unknown 25201 1726882700.24870: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-25201fmfeipqk/ansiballz_cache/ansible.modules.dnf-ZIP_DEFLATED 25201 1726882700.24873: variable 'ansible_facts' from source: unknown 25201 1726882700.24876: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882700.2148938-26116-248710385906586/AnsiballZ_dnf.py 25201 1726882700.25003: Sending initial data 25201 1726882700.25006: Sent initial data (152 bytes) 25201 1726882700.26624: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25201 1726882700.26646: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882700.26656: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882700.26675: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882700.26724: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882700.26746: stderr chunk (state=3): >>>debug2: match not found <<< 25201 1726882700.26761: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882700.26777: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25201 1726882700.26784: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 25201 1726882700.26795: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 25201 1726882700.26809: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882700.26818: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 
1726882700.26829: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882700.26837: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882700.26843: stderr chunk (state=3): >>>debug2: match found <<< 25201 1726882700.26852: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882700.26939: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882700.26955: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882700.26970: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882700.27093: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882700.28872: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 25201 1726882700.28972: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 25201 1726882700.29076: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-25201fmfeipqk/tmpa_2r5mn2 /root/.ansible/tmp/ansible-tmp-1726882700.2148938-26116-248710385906586/AnsiballZ_dnf.py <<< 25201 1726882700.29168: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 25201 1726882700.30877: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882700.30947: stderr chunk (state=3): >>><<< 25201 1726882700.30950: stdout chunk (state=3): >>><<< 25201 1726882700.30972: done transferring module to remote 25201 1726882700.30982: _low_level_execute_command(): starting 25201 1726882700.30987: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882700.2148938-26116-248710385906586/ /root/.ansible/tmp/ansible-tmp-1726882700.2148938-26116-248710385906586/AnsiballZ_dnf.py && sleep 0' 25201 1726882700.31706: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25201 1726882700.31723: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882700.31739: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882700.31753: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882700.31804: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882700.31811: stderr chunk (state=3): >>>debug2: match not found <<< 25201 1726882700.31822: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882700.31839: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25201 1726882700.31846: stderr chunk (state=3): >>>debug2: 
resolve_canonicalize: hostname 10.31.11.158 is address <<< 25201 1726882700.31852: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 25201 1726882700.31860: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882700.31873: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882700.31885: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882700.31894: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882700.31902: stderr chunk (state=3): >>>debug2: match found <<< 25201 1726882700.31908: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882700.32017: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882700.32024: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882700.32035: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882700.32162: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882700.33944: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882700.33987: stderr chunk (state=3): >>><<< 25201 1726882700.33990: stdout chunk (state=3): >>><<< 25201 1726882700.34004: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25201 1726882700.34007: _low_level_execute_command(): starting 25201 1726882700.34012: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882700.2148938-26116-248710385906586/AnsiballZ_dnf.py && sleep 0' 25201 1726882700.34432: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882700.34437: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882700.34451: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882700.34493: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 25201 1726882700.34497: stderr chunk (state=3): >>>debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration <<< 25201 1726882700.34499: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882700.34501: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882700.34552: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882700.34555: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882700.34663: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882701.36526: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iputils"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 25201 1726882701.42358: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
<<< 25201 1726882701.42367: stderr chunk (state=3): >>><<< 25201 1726882701.42370: stdout chunk (state=3): >>><<< 25201 1726882701.42384: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iputils"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
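The invocation dumped above shows the generic package action resolving to ansible.legacy.dnf with module_args name=["iputils"] and state="present", and the module returning "Nothing to do" with rc=0, i.e. iputils was already installed on the managed node. Reconstructed from those logged arguments, the task at tests_ipv6.yml:81 is roughly the sketch below; the exact option names and any additional parameters in the real playbook are assumptions.

# Sketch inferred from the logged module_args for the task
# "Ensure ping6 command is present" (tests_ipv6.yml:81).
# Only name and state are confirmed by the log; the real task
# may pass further options.
- name: Ensure ping6 command is present
  package:
    name: iputils
    state: present

Using the generic package action rather than calling dnf directly keeps the task portable across package managers; in this run it resolved to the dnf module, as the _execute_module line below records.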
25201 1726882701.42419: done with _execute_module (ansible.legacy.dnf, {'name': 'iputils', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882700.2148938-26116-248710385906586/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 25201 1726882701.42423: _low_level_execute_command(): starting 25201 1726882701.42428: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882700.2148938-26116-248710385906586/ > /dev/null 2>&1 && sleep 0' 25201 1726882701.42848: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882701.42852: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882701.42892: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882701.42896: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882701.42907: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882701.42962: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882701.42971: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882701.42973: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882701.43069: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882701.44881: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882701.44922: stderr chunk (state=3): >>><<< 25201 1726882701.44925: stdout chunk (state=3): >>><<< 25201 1726882701.44941: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25201 1726882701.44946: handler run complete 25201 1726882701.44974: attempt loop complete, returning result 25201 1726882701.44977: _execute() done 25201 1726882701.44979: dumping result to json 25201 1726882701.44983: done dumping result, returning 25201 1726882701.44991: done running TaskExecutor() for managed_node2/TASK: Ensure ping6 command is present [0e448fcc-3ce9-313b-197e-000000000064] 25201 1726882701.44995: sending task result for task 0e448fcc-3ce9-313b-197e-000000000064 25201 1726882701.45098: done sending task result for task 0e448fcc-3ce9-313b-197e-000000000064 25201 1726882701.45102: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 25201 1726882701.45172: no more pending results, returning what we have 25201 1726882701.45175: results queue empty 25201 1726882701.45176: checking for any_errors_fatal 25201 1726882701.45182: done checking for any_errors_fatal 25201 1726882701.45182: checking for max_fail_percentage 25201 1726882701.45184: done checking for max_fail_percentage 25201 1726882701.45185: checking to see if all hosts have failed and the running result is not ok 25201 1726882701.45186: done checking to see if all hosts have failed 25201 1726882701.45186: getting the remaining hosts for this loop 25201 1726882701.45188: done getting the remaining hosts for this loop 25201 1726882701.45192: getting the next task for host managed_node2 25201 1726882701.45198: done getting next task for host managed_node2 25201 1726882701.45200: ^ task is: TASK: Test gateway can be pinged 25201 1726882701.45202: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25201 1726882701.45209: getting variables 25201 1726882701.45210: in VariableManager get_vars() 25201 1726882701.45248: Calling all_inventory to load vars for managed_node2 25201 1726882701.45251: Calling groups_inventory to load vars for managed_node2 25201 1726882701.45253: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882701.45262: Calling all_plugins_play to load vars for managed_node2 25201 1726882701.45266: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882701.45270: Calling groups_plugins_play to load vars for managed_node2 25201 1726882701.46100: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882701.47175: done with get_vars() 25201 1726882701.47196: done getting variables 25201 1726882701.47248: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Test gateway can be pinged] ********************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:86 Friday 20 September 2024 21:38:21 -0400 (0:00:01.394) 0:00:22.647 ****** 25201 1726882701.47276: entering _queue_task() for managed_node2/command 25201 1726882701.47508: worker is 1 (out of 1 available) 25201 1726882701.47520: exiting _queue_task() for managed_node2/command 25201 1726882701.47532: done queuing things up, now waiting for results queue to drain 25201 1726882701.47533: waiting for pending results... 25201 1726882701.47785: running TaskExecutor() for managed_node2/TASK: Test gateway can be pinged 25201 1726882701.47852: in run() - task 0e448fcc-3ce9-313b-197e-000000000065 25201 1726882701.47869: variable 'ansible_search_path' from source: unknown 25201 1726882701.47902: calling self._execute() 25201 1726882701.47984: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882701.47989: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882701.47999: variable 'omit' from source: magic vars 25201 1726882701.48331: variable 'ansible_distribution_major_version' from source: facts 25201 1726882701.48342: Evaluated conditional (ansible_distribution_major_version != '6'): True 25201 1726882701.48347: variable 'omit' from source: magic vars 25201 1726882701.48371: variable 'omit' from source: magic vars 25201 1726882701.48405: variable 'omit' from source: magic vars 25201 1726882701.48442: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25201 1726882701.48476: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25201 1726882701.48494: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25201 1726882701.48509: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25201 1726882701.48523: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25201 1726882701.48549: variable 'inventory_hostname' from source: host vars for 'managed_node2' 25201 1726882701.48552: variable 'ansible_host' from source: host vars for 
'managed_node2' 25201 1726882701.48555: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882701.48644: Set connection var ansible_shell_executable to /bin/sh 25201 1726882701.48648: Set connection var ansible_pipelining to False 25201 1726882701.48654: Set connection var ansible_connection to ssh 25201 1726882701.48659: Set connection var ansible_module_compression to ZIP_DEFLATED 25201 1726882701.48661: Set connection var ansible_shell_type to sh 25201 1726882701.48671: Set connection var ansible_timeout to 10 25201 1726882701.48692: variable 'ansible_shell_executable' from source: unknown 25201 1726882701.48695: variable 'ansible_connection' from source: unknown 25201 1726882701.48697: variable 'ansible_module_compression' from source: unknown 25201 1726882701.48700: variable 'ansible_shell_type' from source: unknown 25201 1726882701.48702: variable 'ansible_shell_executable' from source: unknown 25201 1726882701.48704: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882701.48709: variable 'ansible_pipelining' from source: unknown 25201 1726882701.48711: variable 'ansible_timeout' from source: unknown 25201 1726882701.48716: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882701.48832: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 25201 1726882701.48841: variable 'omit' from source: magic vars 25201 1726882701.48844: starting attempt loop 25201 1726882701.48847: running the handler 25201 1726882701.48862: _low_level_execute_command(): starting 25201 1726882701.48871: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 25201 1726882701.49387: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882701.49399: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882701.49438: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882701.49441: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882701.49444: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882701.49497: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882701.49500: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882701.49603: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882701.51195: stdout chunk (state=3): >>>/root <<< 25201 1726882701.51298: 
stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882701.51342: stderr chunk (state=3): >>><<< 25201 1726882701.51345: stdout chunk (state=3): >>><<< 25201 1726882701.51361: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25201 1726882701.51376: _low_level_execute_command(): starting 25201 1726882701.51380: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882701.513606-26197-129438749646237 `" && echo ansible-tmp-1726882701.513606-26197-129438749646237="` echo /root/.ansible/tmp/ansible-tmp-1726882701.513606-26197-129438749646237 `" ) && sleep 0' 25201 1726882701.51779: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882701.51785: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882701.51816: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882701.51828: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882701.51888: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882701.51891: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882701.52007: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882701.53866: stdout chunk (state=3): >>>ansible-tmp-1726882701.513606-26197-129438749646237=/root/.ansible/tmp/ansible-tmp-1726882701.513606-26197-129438749646237 <<< 25201 1726882701.53976: stderr chunk (state=3): 
>>>debug2: Received exit status from master 0 <<< 25201 1726882701.54016: stderr chunk (state=3): >>><<< 25201 1726882701.54019: stdout chunk (state=3): >>><<< 25201 1726882701.54032: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882701.513606-26197-129438749646237=/root/.ansible/tmp/ansible-tmp-1726882701.513606-26197-129438749646237 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25201 1726882701.54052: variable 'ansible_module_compression' from source: unknown 25201 1726882701.54096: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-25201fmfeipqk/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 25201 1726882701.54124: variable 'ansible_facts' from source: unknown 25201 1726882701.54186: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882701.513606-26197-129438749646237/AnsiballZ_command.py 25201 1726882701.54279: Sending initial data 25201 1726882701.54282: Sent initial data (155 bytes) 25201 1726882701.54897: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882701.54903: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882701.54935: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882701.54946: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882701.54997: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882701.55008: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882701.55111: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 
1726882701.56889: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 25201 1726882701.56987: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 25201 1726882701.57091: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-25201fmfeipqk/tmp1tn50nwf /root/.ansible/tmp/ansible-tmp-1726882701.513606-26197-129438749646237/AnsiballZ_command.py <<< 25201 1726882701.57186: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 25201 1726882701.58205: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882701.58289: stderr chunk (state=3): >>><<< 25201 1726882701.58292: stdout chunk (state=3): >>><<< 25201 1726882701.58306: done transferring module to remote 25201 1726882701.58313: _low_level_execute_command(): starting 25201 1726882701.58318: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882701.513606-26197-129438749646237/ /root/.ansible/tmp/ansible-tmp-1726882701.513606-26197-129438749646237/AnsiballZ_command.py && sleep 0' 25201 1726882701.58715: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882701.58721: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882701.58762: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882701.58770: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882701.58773: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882701.58823: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882701.58827: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882701.58930: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882701.60669: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882701.60712: stderr chunk (state=3): >>><<< 25201 1726882701.60715: stdout chunk (state=3): >>><<< 25201 1726882701.60727: _low_level_execute_command() done: rc=0, stdout=, 
stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25201 1726882701.60733: _low_level_execute_command(): starting 25201 1726882701.60736: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882701.513606-26197-129438749646237/AnsiballZ_command.py && sleep 0' 25201 1726882701.61123: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882701.61128: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882701.61170: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882701.61174: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration <<< 25201 1726882701.61187: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882701.61189: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882701.61231: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882701.61235: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882701.61343: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882701.74817: stdout chunk (state=3): >>> {"changed": true, "stdout": "PING 2001:db8::1(2001:db8::1) 56 data bytes\n64 bytes from 2001:db8::1: icmp_seq=1 ttl=64 time=0.041 ms\n\n--- 2001:db8::1 ping statistics ---\n1 packets transmitted, 1 received, 0% packet loss, time 0ms\nrtt min/avg/max/mdev = 0.041/0.041/0.041/0.000 ms", "stderr": "", "rc": 0, "cmd": ["ping6", "-c1", "2001:db8::1"], "start": "2024-09-20 21:38:21.742272", "end": "2024-09-20 21:38:21.746258", "delta": "0:00:00.003986", "msg": "", "invocation": {"module_args": 
{"_raw_params": "ping6 -c1 2001:db8::1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 25201 1726882701.75977: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. <<< 25201 1726882701.76019: stderr chunk (state=3): >>><<< 25201 1726882701.76023: stdout chunk (state=3): >>><<< 25201 1726882701.76036: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "PING 2001:db8::1(2001:db8::1) 56 data bytes\n64 bytes from 2001:db8::1: icmp_seq=1 ttl=64 time=0.041 ms\n\n--- 2001:db8::1 ping statistics ---\n1 packets transmitted, 1 received, 0% packet loss, time 0ms\nrtt min/avg/max/mdev = 0.041/0.041/0.041/0.000 ms", "stderr": "", "rc": 0, "cmd": ["ping6", "-c1", "2001:db8::1"], "start": "2024-09-20 21:38:21.742272", "end": "2024-09-20 21:38:21.746258", "delta": "0:00:00.003986", "msg": "", "invocation": {"module_args": {"_raw_params": "ping6 -c1 2001:db8::1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
25201 1726882701.76069: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ping6 -c1 2001:db8::1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882701.513606-26197-129438749646237/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 25201 1726882701.76073: _low_level_execute_command(): starting 25201 1726882701.76078: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882701.513606-26197-129438749646237/ > /dev/null 2>&1 && sleep 0' 25201 1726882701.76477: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25201 1726882701.76482: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882701.76491: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882701.76532: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882701.76535: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882701.76537: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882701.76592: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882701.76596: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882701.76699: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882701.78475: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882701.78518: stderr chunk (state=3): >>><<< 25201 1726882701.78521: stdout chunk (state=3): >>><<< 25201 1726882701.78532: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25201 1726882701.78539: handler run complete 25201 1726882701.78555: Evaluated conditional (False): False 25201 1726882701.78568: attempt loop complete, returning result 25201 1726882701.78571: _execute() done 25201 1726882701.78573: dumping result to json 25201 1726882701.78575: done dumping result, returning 25201 1726882701.78582: done running TaskExecutor() for managed_node2/TASK: Test gateway can be pinged [0e448fcc-3ce9-313b-197e-000000000065] 25201 1726882701.78591: sending task result for task 0e448fcc-3ce9-313b-197e-000000000065 25201 1726882701.78682: done sending task result for task 0e448fcc-3ce9-313b-197e-000000000065 25201 1726882701.78684: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": [ "ping6", "-c1", "2001:db8::1" ], "delta": "0:00:00.003986", "end": "2024-09-20 21:38:21.746258", "rc": 0, "start": "2024-09-20 21:38:21.742272" } STDOUT: PING 2001:db8::1(2001:db8::1) 56 data bytes 64 bytes from 2001:db8::1: icmp_seq=1 ttl=64 time=0.041 ms --- 2001:db8::1 ping statistics --- 1 packets transmitted, 1 received, 0% packet loss, time 0ms rtt min/avg/max/mdev = 0.041/0.041/0.041/0.000 ms 25201 1726882701.78752: no more pending results, returning what we have 25201 1726882701.78755: results queue empty 25201 1726882701.78756: checking for any_errors_fatal 25201 1726882701.78767: done checking for any_errors_fatal 25201 1726882701.78768: checking for max_fail_percentage 25201 1726882701.78770: done checking for max_fail_percentage 25201 1726882701.78771: checking to see if all hosts have failed and the running result is not ok 25201 1726882701.78772: done checking to see if all hosts have failed 25201 1726882701.78772: getting the remaining hosts for this loop 25201 1726882701.78774: done getting the remaining hosts for this loop 25201 1726882701.78777: getting the next task for host managed_node2 25201 1726882701.78784: done getting next task for host managed_node2 25201 1726882701.78787: ^ task is: TASK: TEARDOWN: remove profiles. 25201 1726882701.78788: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25201 1726882701.78791: getting variables 25201 1726882701.78792: in VariableManager get_vars() 25201 1726882701.78826: Calling all_inventory to load vars for managed_node2 25201 1726882701.78829: Calling groups_inventory to load vars for managed_node2 25201 1726882701.78831: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882701.78840: Calling all_plugins_play to load vars for managed_node2 25201 1726882701.78842: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882701.78844: Calling groups_plugins_play to load vars for managed_node2 25201 1726882701.79768: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882701.80689: done with get_vars() 25201 1726882701.80704: done getting variables 25201 1726882701.80749: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [TEARDOWN: remove profiles.] ********************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:92 Friday 20 September 2024 21:38:21 -0400 (0:00:00.334) 0:00:22.982 ****** 25201 1726882701.80772: entering _queue_task() for managed_node2/debug 25201 1726882701.80962: worker is 1 (out of 1 available) 25201 1726882701.80978: exiting _queue_task() for managed_node2/debug 25201 1726882701.80989: done queuing things up, now waiting for results queue to drain 25201 1726882701.80990: waiting for pending results... 25201 1726882701.81155: running TaskExecutor() for managed_node2/TASK: TEARDOWN: remove profiles. 
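The two timing figures in the task banner above read as follows: the value in parentheses is how long the previous task ("Test gateway can be pinged") took, and the trailing value is the cumulative playbook runtime. A small illustration of that arithmetic; the 0:00:22.648 starting total is inferred by subtraction and does not appear in the log itself:

```python
from datetime import timedelta

def hms(value: str) -> timedelta:
    # Parse the "H:MM:SS.ffffff" strings used in the task banners.
    hours, minutes, seconds = value.split(":")
    return timedelta(hours=int(hours), minutes=int(minutes), seconds=float(seconds))

previous_total = hms("0:00:22.648")    # inferred running total before this banner
task_duration  = hms("0:00:00.334")    # parenthesised figure: previous task's duration
print(previous_total + task_duration)  # 0:00:22.982000, matching the trailing 0:00:22.982
```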
25201 1726882701.81222: in run() - task 0e448fcc-3ce9-313b-197e-000000000066 25201 1726882701.81232: variable 'ansible_search_path' from source: unknown 25201 1726882701.81260: calling self._execute() 25201 1726882701.81332: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882701.81336: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882701.81345: variable 'omit' from source: magic vars 25201 1726882701.81617: variable 'ansible_distribution_major_version' from source: facts 25201 1726882701.81627: Evaluated conditional (ansible_distribution_major_version != '6'): True 25201 1726882701.81633: variable 'omit' from source: magic vars 25201 1726882701.81649: variable 'omit' from source: magic vars 25201 1726882701.81677: variable 'omit' from source: magic vars 25201 1726882701.81707: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25201 1726882701.81732: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25201 1726882701.81749: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25201 1726882701.81767: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25201 1726882701.81781: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25201 1726882701.81804: variable 'inventory_hostname' from source: host vars for 'managed_node2' 25201 1726882701.81807: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882701.81810: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882701.81883: Set connection var ansible_shell_executable to /bin/sh 25201 1726882701.81887: Set connection var ansible_pipelining to False 25201 1726882701.81892: Set connection var ansible_connection to ssh 25201 1726882701.81897: Set connection var ansible_module_compression to ZIP_DEFLATED 25201 1726882701.81899: Set connection var ansible_shell_type to sh 25201 1726882701.81906: Set connection var ansible_timeout to 10 25201 1726882701.81921: variable 'ansible_shell_executable' from source: unknown 25201 1726882701.81924: variable 'ansible_connection' from source: unknown 25201 1726882701.81926: variable 'ansible_module_compression' from source: unknown 25201 1726882701.81929: variable 'ansible_shell_type' from source: unknown 25201 1726882701.81931: variable 'ansible_shell_executable' from source: unknown 25201 1726882701.81933: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882701.81937: variable 'ansible_pipelining' from source: unknown 25201 1726882701.81939: variable 'ansible_timeout' from source: unknown 25201 1726882701.81943: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882701.82045: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 25201 1726882701.82054: variable 'omit' from source: magic vars 25201 1726882701.82057: starting attempt loop 25201 1726882701.82059: running the handler 25201 1726882701.82101: handler run complete 25201 1726882701.82113: attempt loop complete, 
returning result 25201 1726882701.82116: _execute() done 25201 1726882701.82118: dumping result to json 25201 1726882701.82121: done dumping result, returning 25201 1726882701.82127: done running TaskExecutor() for managed_node2/TASK: TEARDOWN: remove profiles. [0e448fcc-3ce9-313b-197e-000000000066] 25201 1726882701.82132: sending task result for task 0e448fcc-3ce9-313b-197e-000000000066 25201 1726882701.82212: done sending task result for task 0e448fcc-3ce9-313b-197e-000000000066 25201 1726882701.82215: WORKER PROCESS EXITING ok: [managed_node2] => {} MSG: ################################################## 25201 1726882701.82261: no more pending results, returning what we have 25201 1726882701.82266: results queue empty 25201 1726882701.82267: checking for any_errors_fatal 25201 1726882701.82274: done checking for any_errors_fatal 25201 1726882701.82275: checking for max_fail_percentage 25201 1726882701.82277: done checking for max_fail_percentage 25201 1726882701.82277: checking to see if all hosts have failed and the running result is not ok 25201 1726882701.82278: done checking to see if all hosts have failed 25201 1726882701.82279: getting the remaining hosts for this loop 25201 1726882701.82280: done getting the remaining hosts for this loop 25201 1726882701.82284: getting the next task for host managed_node2 25201 1726882701.82293: done getting next task for host managed_node2 25201 1726882701.82297: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 25201 1726882701.82299: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 25201 1726882701.82314: getting variables 25201 1726882701.82316: in VariableManager get_vars() 25201 1726882701.82352: Calling all_inventory to load vars for managed_node2 25201 1726882701.82354: Calling groups_inventory to load vars for managed_node2 25201 1726882701.82357: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882701.82366: Calling all_plugins_play to load vars for managed_node2 25201 1726882701.82368: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882701.82370: Calling groups_plugins_play to load vars for managed_node2 25201 1726882701.83131: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882701.84131: done with get_vars() 25201 1726882701.84144: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 21:38:21 -0400 (0:00:00.034) 0:00:23.016 ****** 25201 1726882701.84209: entering _queue_task() for managed_node2/include_tasks 25201 1726882701.84384: worker is 1 (out of 1 available) 25201 1726882701.84398: exiting _queue_task() for managed_node2/include_tasks 25201 1726882701.84410: done queuing things up, now waiting for results queue to drain 25201 1726882701.84412: waiting for pending results... 
25201 1726882701.84573: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 25201 1726882701.84656: in run() - task 0e448fcc-3ce9-313b-197e-00000000006e 25201 1726882701.84670: variable 'ansible_search_path' from source: unknown 25201 1726882701.84674: variable 'ansible_search_path' from source: unknown 25201 1726882701.84700: calling self._execute() 25201 1726882701.84762: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882701.84771: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882701.84779: variable 'omit' from source: magic vars 25201 1726882701.85032: variable 'ansible_distribution_major_version' from source: facts 25201 1726882701.85041: Evaluated conditional (ansible_distribution_major_version != '6'): True 25201 1726882701.85047: _execute() done 25201 1726882701.85051: dumping result to json 25201 1726882701.85054: done dumping result, returning 25201 1726882701.85060: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0e448fcc-3ce9-313b-197e-00000000006e] 25201 1726882701.85066: sending task result for task 0e448fcc-3ce9-313b-197e-00000000006e 25201 1726882701.85147: done sending task result for task 0e448fcc-3ce9-313b-197e-00000000006e 25201 1726882701.85150: WORKER PROCESS EXITING 25201 1726882701.85209: no more pending results, returning what we have 25201 1726882701.85212: in VariableManager get_vars() 25201 1726882701.85248: Calling all_inventory to load vars for managed_node2 25201 1726882701.85251: Calling groups_inventory to load vars for managed_node2 25201 1726882701.85253: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882701.85261: Calling all_plugins_play to load vars for managed_node2 25201 1726882701.85264: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882701.85267: Calling groups_plugins_play to load vars for managed_node2 25201 1726882701.85993: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882701.86913: done with get_vars() 25201 1726882701.86925: variable 'ansible_search_path' from source: unknown 25201 1726882701.86926: variable 'ansible_search_path' from source: unknown 25201 1726882701.86951: we have included files to process 25201 1726882701.86951: generating all_blocks data 25201 1726882701.86953: done generating all_blocks data 25201 1726882701.86957: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 25201 1726882701.86958: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 25201 1726882701.86959: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 25201 1726882701.87345: done processing included file 25201 1726882701.87346: iterating over new_blocks loaded from include file 25201 1726882701.87347: in VariableManager get_vars() 25201 1726882701.87362: done with get_vars() 25201 1726882701.87365: filtering new block on tags 25201 1726882701.87377: done filtering new block on tags 25201 1726882701.87379: in VariableManager get_vars() 25201 1726882701.87392: done with get_vars() 25201 1726882701.87393: filtering new block on tags 25201 1726882701.87405: done filtering new block on tags 25201 1726882701.87406: in 
VariableManager get_vars() 25201 1726882701.87420: done with get_vars() 25201 1726882701.87421: filtering new block on tags 25201 1726882701.87432: done filtering new block on tags 25201 1726882701.87433: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node2 25201 1726882701.87436: extending task lists for all hosts with included blocks 25201 1726882701.87927: done extending task lists 25201 1726882701.87928: done processing included files 25201 1726882701.87929: results queue empty 25201 1726882701.87929: checking for any_errors_fatal 25201 1726882701.87931: done checking for any_errors_fatal 25201 1726882701.87932: checking for max_fail_percentage 25201 1726882701.87932: done checking for max_fail_percentage 25201 1726882701.87933: checking to see if all hosts have failed and the running result is not ok 25201 1726882701.87933: done checking to see if all hosts have failed 25201 1726882701.87934: getting the remaining hosts for this loop 25201 1726882701.87935: done getting the remaining hosts for this loop 25201 1726882701.87936: getting the next task for host managed_node2 25201 1726882701.87939: done getting next task for host managed_node2 25201 1726882701.87940: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 25201 1726882701.87942: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 25201 1726882701.87948: getting variables 25201 1726882701.87949: in VariableManager get_vars() 25201 1726882701.87959: Calling all_inventory to load vars for managed_node2 25201 1726882701.87960: Calling groups_inventory to load vars for managed_node2 25201 1726882701.87961: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882701.87966: Calling all_plugins_play to load vars for managed_node2 25201 1726882701.87968: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882701.87970: Calling groups_plugins_play to load vars for managed_node2 25201 1726882701.89196: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882701.91443: done with get_vars() 25201 1726882701.91474: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 21:38:21 -0400 (0:00:00.073) 0:00:23.090 ****** 25201 1726882701.91554: entering _queue_task() for managed_node2/setup 25201 1726882701.91996: worker is 1 (out of 1 available) 25201 1726882701.92010: exiting _queue_task() for managed_node2/setup 25201 1726882701.92024: done queuing things up, now waiting for results queue to drain 25201 1726882701.92026: waiting for pending results... 25201 1726882701.92421: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 25201 1726882701.92639: in run() - task 0e448fcc-3ce9-313b-197e-000000000513 25201 1726882701.92670: variable 'ansible_search_path' from source: unknown 25201 1726882701.92680: variable 'ansible_search_path' from source: unknown 25201 1726882701.92771: calling self._execute() 25201 1726882701.92880: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882701.92918: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882701.92949: variable 'omit' from source: magic vars 25201 1726882701.93487: variable 'ansible_distribution_major_version' from source: facts 25201 1726882701.93508: Evaluated conditional (ansible_distribution_major_version != '6'): True 25201 1726882701.93792: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 25201 1726882701.96351: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 25201 1726882701.96457: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 25201 1726882701.96490: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 25201 1726882701.96516: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 25201 1726882701.96540: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 25201 1726882701.96640: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25201 1726882701.96660: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 25201 1726882701.96684: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25201 1726882701.96753: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25201 1726882701.96795: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25201 1726882701.96830: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25201 1726882701.96846: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25201 1726882701.96867: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25201 1726882701.96923: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25201 1726882701.96936: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25201 1726882701.97093: variable '__network_required_facts' from source: role '' defaults 25201 1726882701.97102: variable 'ansible_facts' from source: unknown 25201 1726882701.97652: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 25201 1726882701.97656: when evaluation is False, skipping this task 25201 1726882701.97658: _execute() done 25201 1726882701.97661: dumping result to json 25201 1726882701.97665: done dumping result, returning 25201 1726882701.97675: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0e448fcc-3ce9-313b-197e-000000000513] 25201 1726882701.97678: sending task result for task 0e448fcc-3ce9-313b-197e-000000000513 25201 1726882701.97759: done sending task result for task 0e448fcc-3ce9-313b-197e-000000000513 25201 1726882701.97762: WORKER PROCESS EXITING skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 25201 1726882701.97806: no more pending results, returning what we have 25201 1726882701.97810: results queue empty 25201 1726882701.97811: checking for any_errors_fatal 25201 1726882701.97813: done checking for any_errors_fatal 25201 1726882701.97813: checking for max_fail_percentage 25201 1726882701.97815: done checking for max_fail_percentage 25201 1726882701.97816: checking to see if all hosts have failed and the running result is not ok 25201 1726882701.97816: done checking to see if all hosts have failed 25201 1726882701.97817: getting the remaining hosts for this loop 25201 1726882701.97819: done getting the remaining hosts for 
this loop 25201 1726882701.97823: getting the next task for host managed_node2 25201 1726882701.97832: done getting next task for host managed_node2 25201 1726882701.97835: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 25201 1726882701.97839: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 25201 1726882701.97855: getting variables 25201 1726882701.97857: in VariableManager get_vars() 25201 1726882701.97900: Calling all_inventory to load vars for managed_node2 25201 1726882701.97903: Calling groups_inventory to load vars for managed_node2 25201 1726882701.97905: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882701.97915: Calling all_plugins_play to load vars for managed_node2 25201 1726882701.97917: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882701.97919: Calling groups_plugins_play to load vars for managed_node2 25201 1726882701.98750: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882702.00608: done with get_vars() 25201 1726882702.00630: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 21:38:22 -0400 (0:00:00.091) 0:00:23.182 ****** 25201 1726882702.00742: entering _queue_task() for managed_node2/stat 25201 1726882702.01034: worker is 1 (out of 1 available) 25201 1726882702.01047: exiting _queue_task() for managed_node2/stat 25201 1726882702.01060: done queuing things up, now waiting for results queue to drain 25201 1726882702.01061: waiting for pending results... 
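The skip of "Ensure ansible_facts used by role are present" logged just above hinges on the recorded conditional `__network_required_facts | difference(ansible_facts.keys() | list) | length > 0` evaluating to False. A plain-Python rendering of that Jinja filter chain; the fact names below are illustrative only, since the role's actual `__network_required_facts` list is not shown in this log:

```python
# Plain-Python rendering of the logged conditional:
#   __network_required_facts | difference(ansible_facts.keys() | list) | length > 0
network_required_facts = ["distribution", "os_family"]   # example values, not the role's real list
ansible_facts = {
    "distribution": "CentOS",
    "os_family": "RedHat",
    "distribution_major_version": "9",
}

missing = set(network_required_facts) - set(ansible_facts)  # Jinja's difference() filter
print(len(missing) > 0)  # False -> nothing is missing, so the fact-gathering task is skipped
```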
25201 1726882702.01344: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 25201 1726882702.01497: in run() - task 0e448fcc-3ce9-313b-197e-000000000515 25201 1726882702.01514: variable 'ansible_search_path' from source: unknown 25201 1726882702.01518: variable 'ansible_search_path' from source: unknown 25201 1726882702.01551: calling self._execute() 25201 1726882702.01645: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882702.01651: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882702.01662: variable 'omit' from source: magic vars 25201 1726882702.02047: variable 'ansible_distribution_major_version' from source: facts 25201 1726882702.02065: Evaluated conditional (ansible_distribution_major_version != '6'): True 25201 1726882702.02229: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 25201 1726882702.02497: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 25201 1726882702.02540: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 25201 1726882702.02575: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 25201 1726882702.02613: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 25201 1726882702.02695: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 25201 1726882702.02723: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 25201 1726882702.02750: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 25201 1726882702.02780: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 25201 1726882702.02868: variable '__network_is_ostree' from source: set_fact 25201 1726882702.02877: Evaluated conditional (not __network_is_ostree is defined): False 25201 1726882702.02880: when evaluation is False, skipping this task 25201 1726882702.02883: _execute() done 25201 1726882702.02886: dumping result to json 25201 1726882702.02888: done dumping result, returning 25201 1726882702.02895: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [0e448fcc-3ce9-313b-197e-000000000515] 25201 1726882702.02900: sending task result for task 0e448fcc-3ce9-313b-197e-000000000515 25201 1726882702.02986: done sending task result for task 0e448fcc-3ce9-313b-197e-000000000515 25201 1726882702.02988: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 25201 1726882702.03086: no more pending results, returning what we have 25201 1726882702.03090: results queue empty 25201 1726882702.03091: checking for any_errors_fatal 25201 1726882702.03097: done checking for any_errors_fatal 25201 1726882702.03097: checking for 
max_fail_percentage 25201 1726882702.03099: done checking for max_fail_percentage 25201 1726882702.03101: checking to see if all hosts have failed and the running result is not ok 25201 1726882702.03102: done checking to see if all hosts have failed 25201 1726882702.03103: getting the remaining hosts for this loop 25201 1726882702.03105: done getting the remaining hosts for this loop 25201 1726882702.03109: getting the next task for host managed_node2 25201 1726882702.03117: done getting next task for host managed_node2 25201 1726882702.03121: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 25201 1726882702.03125: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 25201 1726882702.03142: getting variables 25201 1726882702.03145: in VariableManager get_vars() 25201 1726882702.03187: Calling all_inventory to load vars for managed_node2 25201 1726882702.03190: Calling groups_inventory to load vars for managed_node2 25201 1726882702.03193: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882702.03203: Calling all_plugins_play to load vars for managed_node2 25201 1726882702.03206: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882702.03209: Calling groups_plugins_play to load vars for managed_node2 25201 1726882702.04744: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882702.06466: done with get_vars() 25201 1726882702.06487: done getting variables 25201 1726882702.06541: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 21:38:22 -0400 (0:00:00.058) 0:00:23.240 ****** 25201 1726882702.06581: entering _queue_task() for managed_node2/set_fact 25201 1726882702.06838: worker is 1 (out of 1 available) 25201 1726882702.06851: exiting _queue_task() for managed_node2/set_fact 25201 1726882702.06867: done queuing things up, now waiting for results queue to drain 25201 1726882702.06869: waiting for pending results... 
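Both the stat probe skipped above and the set_fact queued next are guarded by the same logged condition, `not __network_is_ostree is defined`; because the fact already exists ("from source: set_fact", set earlier in the run), neither task runs again. A small compute-once sketch of that guard; `probe_for_ostree()` is a named placeholder, not the role's real check:

```python
def probe_for_ostree() -> bool:
    # Stand-in result; in the role the real probe is the stat task being skipped here.
    return False

facts = {"__network_is_ostree": False}  # already set by an earlier role invocation in this run

if "__network_is_ostree" not in facts:              # "not __network_is_ostree is defined"
    facts["__network_is_ostree"] = probe_for_ostree()
else:
    print("skipping: Conditional result was False")  # what the log reports instead
```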
25201 1726882702.07144: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 25201 1726882702.07309: in run() - task 0e448fcc-3ce9-313b-197e-000000000516 25201 1726882702.07330: variable 'ansible_search_path' from source: unknown 25201 1726882702.07337: variable 'ansible_search_path' from source: unknown 25201 1726882702.07378: calling self._execute() 25201 1726882702.07477: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882702.07490: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882702.07507: variable 'omit' from source: magic vars 25201 1726882702.07879: variable 'ansible_distribution_major_version' from source: facts 25201 1726882702.07896: Evaluated conditional (ansible_distribution_major_version != '6'): True 25201 1726882702.08067: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 25201 1726882702.08336: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 25201 1726882702.08386: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 25201 1726882702.08429: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 25201 1726882702.08469: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 25201 1726882702.08556: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 25201 1726882702.08593: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 25201 1726882702.08629: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 25201 1726882702.08660: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 25201 1726882702.08758: variable '__network_is_ostree' from source: set_fact 25201 1726882702.08774: Evaluated conditional (not __network_is_ostree is defined): False 25201 1726882702.08783: when evaluation is False, skipping this task 25201 1726882702.08790: _execute() done 25201 1726882702.08797: dumping result to json 25201 1726882702.08804: done dumping result, returning 25201 1726882702.08816: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0e448fcc-3ce9-313b-197e-000000000516] 25201 1726882702.08825: sending task result for task 0e448fcc-3ce9-313b-197e-000000000516 25201 1726882702.08926: done sending task result for task 0e448fcc-3ce9-313b-197e-000000000516 25201 1726882702.08934: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 25201 1726882702.08995: no more pending results, returning what we have 25201 1726882702.08999: results queue empty 25201 1726882702.09000: checking for any_errors_fatal 25201 1726882702.09007: done checking for any_errors_fatal 25201 
1726882702.09008: checking for max_fail_percentage 25201 1726882702.09010: done checking for max_fail_percentage 25201 1726882702.09012: checking to see if all hosts have failed and the running result is not ok 25201 1726882702.09013: done checking to see if all hosts have failed 25201 1726882702.09013: getting the remaining hosts for this loop 25201 1726882702.09015: done getting the remaining hosts for this loop 25201 1726882702.09019: getting the next task for host managed_node2 25201 1726882702.09031: done getting next task for host managed_node2 25201 1726882702.09036: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 25201 1726882702.09040: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 25201 1726882702.09058: getting variables 25201 1726882702.09060: in VariableManager get_vars() 25201 1726882702.09104: Calling all_inventory to load vars for managed_node2 25201 1726882702.09108: Calling groups_inventory to load vars for managed_node2 25201 1726882702.09110: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882702.09121: Calling all_plugins_play to load vars for managed_node2 25201 1726882702.09124: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882702.09128: Calling groups_plugins_play to load vars for managed_node2 25201 1726882702.10852: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882702.12547: done with get_vars() 25201 1726882702.12570: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 21:38:22 -0400 (0:00:00.060) 0:00:23.301 ****** 25201 1726882702.12659: entering _queue_task() for managed_node2/service_facts 25201 1726882702.12899: worker is 1 (out of 1 available) 25201 1726882702.12911: exiting _queue_task() for managed_node2/service_facts 25201 1726882702.12923: done queuing things up, now waiting for results queue to drain 25201 1726882702.12924: waiting for pending results... 
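The exchange that follows these queueing messages is the standard per-task round trip for a Python module, here ansible.modules.service_facts: discover the remote home, create a private temp directory, push the AnsiballZ payload over SFTP, mark it executable, run it with the remote Python, and later remove the directory (the rm step was visible earlier, after the ping task). A self-contained sketch of that sequence, not Ansible's own implementation; HOST credentials, LOCAL_PY and the temp-directory name are placeholders:

```python
import subprocess

HOST = "10.31.11.158"                               # address from the log; user/keys assumed
LOCAL_PY = "/tmp/AnsiballZ_service_facts.py"        # locally built module payload (placeholder path)
REMOTE_TMP = "/root/.ansible/tmp/ansible-tmp-demo"  # placeholder temp directory name

def ssh(cmd: str) -> subprocess.CompletedProcess:
    # Each logged step is one "/bin/sh -c '<cmd> && sleep 0'" over the already
    # multiplexed connection ("auto-mux: Trying existing master").
    return subprocess.run(["ssh", HOST, f"/bin/sh -c '{cmd} && sleep 0'"],
                          capture_output=True, text=True)

ssh("echo ~")                                                        # discover the remote home
ssh(f'umask 77 && mkdir -p "{REMOTE_TMP}"')                          # private 0700 temp directory
subprocess.run(["scp", LOCAL_PY, f"{HOST}:{REMOTE_TMP}/"])           # transfer (the log uses sftp put)
ssh(f"chmod u+x {REMOTE_TMP}/ {REMOTE_TMP}/AnsiballZ_service_facts.py")
run = ssh(f"/usr/bin/python3.9 {REMOTE_TMP}/AnsiballZ_service_facts.py")  # module emits one JSON result
ssh(f"rm -f -r {REMOTE_TMP}/ > /dev/null 2>&1")                      # cleanup, as after the ping task
print(run.stdout)
```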
25201 1726882702.13202: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running 25201 1726882702.13366: in run() - task 0e448fcc-3ce9-313b-197e-000000000518 25201 1726882702.13391: variable 'ansible_search_path' from source: unknown 25201 1726882702.13399: variable 'ansible_search_path' from source: unknown 25201 1726882702.13436: calling self._execute() 25201 1726882702.13529: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882702.13541: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882702.13555: variable 'omit' from source: magic vars 25201 1726882702.13920: variable 'ansible_distribution_major_version' from source: facts 25201 1726882702.13938: Evaluated conditional (ansible_distribution_major_version != '6'): True 25201 1726882702.13949: variable 'omit' from source: magic vars 25201 1726882702.14029: variable 'omit' from source: magic vars 25201 1726882702.14071: variable 'omit' from source: magic vars 25201 1726882702.14112: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25201 1726882702.14154: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25201 1726882702.14181: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25201 1726882702.14204: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25201 1726882702.14219: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25201 1726882702.14257: variable 'inventory_hostname' from source: host vars for 'managed_node2' 25201 1726882702.14271: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882702.14282: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882702.14391: Set connection var ansible_shell_executable to /bin/sh 25201 1726882702.14403: Set connection var ansible_pipelining to False 25201 1726882702.14413: Set connection var ansible_connection to ssh 25201 1726882702.14422: Set connection var ansible_module_compression to ZIP_DEFLATED 25201 1726882702.14428: Set connection var ansible_shell_type to sh 25201 1726882702.14441: Set connection var ansible_timeout to 10 25201 1726882702.14475: variable 'ansible_shell_executable' from source: unknown 25201 1726882702.14484: variable 'ansible_connection' from source: unknown 25201 1726882702.14492: variable 'ansible_module_compression' from source: unknown 25201 1726882702.14499: variable 'ansible_shell_type' from source: unknown 25201 1726882702.14506: variable 'ansible_shell_executable' from source: unknown 25201 1726882702.14513: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882702.14521: variable 'ansible_pipelining' from source: unknown 25201 1726882702.14527: variable 'ansible_timeout' from source: unknown 25201 1726882702.14533: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882702.14714: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 25201 1726882702.14727: variable 'omit' from source: magic vars 25201 
1726882702.14735: starting attempt loop 25201 1726882702.14740: running the handler 25201 1726882702.14754: _low_level_execute_command(): starting 25201 1726882702.14768: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 25201 1726882702.15496: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25201 1726882702.15509: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882702.15522: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882702.15539: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882702.15587: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882702.15597: stderr chunk (state=3): >>>debug2: match not found <<< 25201 1726882702.15609: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882702.15625: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25201 1726882702.15635: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 25201 1726882702.15644: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 25201 1726882702.15660: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882702.15681: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882702.15698: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882702.15712: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882702.15725: stderr chunk (state=3): >>>debug2: match found <<< 25201 1726882702.15740: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882702.15822: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882702.15846: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882702.15868: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882702.16010: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882702.17674: stdout chunk (state=3): >>>/root <<< 25201 1726882702.17780: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882702.17860: stderr chunk (state=3): >>><<< 25201 1726882702.17877: stdout chunk (state=3): >>><<< 25201 1726882702.17976: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25201 1726882702.17979: _low_level_execute_command(): starting 25201 1726882702.17982: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882702.179019-26221-193399987024473 `" && echo ansible-tmp-1726882702.179019-26221-193399987024473="` echo /root/.ansible/tmp/ansible-tmp-1726882702.179019-26221-193399987024473 `" ) && sleep 0' 25201 1726882702.18579: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25201 1726882702.18595: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882702.18609: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882702.18631: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882702.18678: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882702.18691: stderr chunk (state=3): >>>debug2: match not found <<< 25201 1726882702.18704: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882702.18721: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25201 1726882702.18731: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 25201 1726882702.18748: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 25201 1726882702.18760: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882702.18778: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882702.18793: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882702.18804: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882702.18816: stderr chunk (state=3): >>>debug2: match found <<< 25201 1726882702.18828: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882702.18912: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882702.18932: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882702.18946: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882702.19092: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882702.20990: stdout chunk (state=3): >>>ansible-tmp-1726882702.179019-26221-193399987024473=/root/.ansible/tmp/ansible-tmp-1726882702.179019-26221-193399987024473 <<< 25201 1726882702.21173: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882702.21177: stdout chunk (state=3): >>><<< 25201 1726882702.21179: stderr chunk (state=3): >>><<< 25201 1726882702.21473: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726882702.179019-26221-193399987024473=/root/.ansible/tmp/ansible-tmp-1726882702.179019-26221-193399987024473 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25201 1726882702.21477: variable 'ansible_module_compression' from source: unknown 25201 1726882702.21479: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-25201fmfeipqk/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 25201 1726882702.21482: variable 'ansible_facts' from source: unknown 25201 1726882702.21483: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882702.179019-26221-193399987024473/AnsiballZ_service_facts.py 25201 1726882702.21719: Sending initial data 25201 1726882702.21723: Sent initial data (161 bytes) 25201 1726882702.23860: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882702.23865: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882702.23903: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882702.23906: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882702.23909: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882702.23976: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882702.23994: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882702.24122: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882702.25875: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports 
extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 25201 1726882702.25973: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 25201 1726882702.26070: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-25201fmfeipqk/tmpmszlft3e /root/.ansible/tmp/ansible-tmp-1726882702.179019-26221-193399987024473/AnsiballZ_service_facts.py <<< 25201 1726882702.26163: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 25201 1726882702.27524: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882702.27772: stderr chunk (state=3): >>><<< 25201 1726882702.27781: stdout chunk (state=3): >>><<< 25201 1726882702.27784: done transferring module to remote 25201 1726882702.27786: _low_level_execute_command(): starting 25201 1726882702.27788: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882702.179019-26221-193399987024473/ /root/.ansible/tmp/ansible-tmp-1726882702.179019-26221-193399987024473/AnsiballZ_service_facts.py && sleep 0' 25201 1726882702.28374: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25201 1726882702.28388: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882702.28402: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882702.28425: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882702.28476: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882702.28489: stderr chunk (state=3): >>>debug2: match not found <<< 25201 1726882702.28504: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882702.28522: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25201 1726882702.28537: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 25201 1726882702.28557: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 25201 1726882702.28572: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882702.28588: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882702.28604: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882702.28617: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882702.28629: stderr chunk (state=3): >>>debug2: match found <<< 25201 1726882702.28643: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882702.28727: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882702.28748: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 
1726882702.28770: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882702.28903: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882702.30647: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882702.30720: stderr chunk (state=3): >>><<< 25201 1726882702.30731: stdout chunk (state=3): >>><<< 25201 1726882702.30769: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25201 1726882702.30773: _low_level_execute_command(): starting 25201 1726882702.30840: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882702.179019-26221-193399987024473/AnsiballZ_service_facts.py && sleep 0' 25201 1726882702.31423: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25201 1726882702.31436: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882702.31449: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882702.31467: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882702.31514: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882702.31527: stderr chunk (state=3): >>>debug2: match not found <<< 25201 1726882702.31540: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882702.31556: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25201 1726882702.31570: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 25201 1726882702.31581: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 25201 1726882702.31594: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882702.31606: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882702.31625: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882702.31640: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882702.31650: stderr chunk (state=3): >>>debug2: match 
found <<< 25201 1726882702.31662: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882702.31744: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882702.31780: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882702.31800: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882702.31934: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882703.64366: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, 
"hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": 
"rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rhsmcertd.service": {"name": "rhsmcertd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": 
"systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "s<<< 25201 1726882703.64398: stdout chunk (state=3): >>>tatic", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": 
{"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", 
"status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhcd.service": {"name": "rhcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm-facts.service": {"name": "rhsm-facts.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm.service": {"name": "rhsm.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": 
"unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": 
"static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 25201 1726882703.65702: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. <<< 25201 1726882703.65731: stderr chunk (state=3): >>><<< 25201 1726882703.65735: stdout chunk (state=3): >>><<< 25201 1726882703.65874: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": 
"dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", 
"source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rhsmcertd.service": {"name": "rhsmcertd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": 
"systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": 
"cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": 
"systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhcd.service": {"name": "rhcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm-facts.service": {"name": "rhsm-facts.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm.service": {"name": "rhsm.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", 
"source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 25201 1726882703.66429: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882702.179019-26221-193399987024473/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 25201 1726882703.66445: _low_level_execute_command(): starting 25201 1726882703.66454: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882702.179019-26221-193399987024473/ > /dev/null 2>&1 && sleep 0' 25201 1726882703.67336: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25201 1726882703.67869: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882703.67885: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882703.67902: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882703.67944: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882703.67955: stderr chunk (state=3): >>>debug2: match not found <<< 25201 1726882703.67970: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882703.67988: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25201 1726882703.68000: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 25201 1726882703.68010: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 25201 1726882703.68020: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882703.68031: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882703.68044: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882703.68054: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882703.68063: stderr chunk (state=3): >>>debug2: match found <<< 25201 1726882703.68078: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882703.68151: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882703.68179: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882703.68194: stderr chunk 
(state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882703.68323: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882703.70198: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882703.70201: stdout chunk (state=3): >>><<< 25201 1726882703.70208: stderr chunk (state=3): >>><<< 25201 1726882703.70222: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25201 1726882703.70228: handler run complete 25201 1726882703.70870: variable 'ansible_facts' from source: unknown 25201 1726882703.70984: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882703.71711: variable 'ansible_facts' from source: unknown 25201 1726882703.71844: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882703.72156: attempt loop complete, returning result 25201 1726882703.72159: _execute() done 25201 1726882703.72162: dumping result to json 25201 1726882703.72339: done dumping result, returning 25201 1726882703.72348: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running [0e448fcc-3ce9-313b-197e-000000000518] 25201 1726882703.72353: sending task result for task 0e448fcc-3ce9-313b-197e-000000000518 25201 1726882703.73478: done sending task result for task 0e448fcc-3ce9-313b-197e-000000000518 25201 1726882703.73482: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 25201 1726882703.73598: no more pending results, returning what we have 25201 1726882703.73601: results queue empty 25201 1726882703.73602: checking for any_errors_fatal 25201 1726882703.73609: done checking for any_errors_fatal 25201 1726882703.73609: checking for max_fail_percentage 25201 1726882703.73611: done checking for max_fail_percentage 25201 1726882703.73612: checking to see if all hosts have failed and the running result is not ok 25201 1726882703.73613: done checking to see if all hosts have failed 25201 1726882703.73614: getting the remaining hosts for this loop 25201 1726882703.73616: done getting the remaining hosts for this loop 25201 1726882703.73620: getting the next task for host managed_node2 
25201 1726882703.73628: done getting next task for host managed_node2 25201 1726882703.73632: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 25201 1726882703.73638: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 25201 1726882703.73649: getting variables 25201 1726882703.73651: in VariableManager get_vars() 25201 1726882703.73697: Calling all_inventory to load vars for managed_node2 25201 1726882703.73700: Calling groups_inventory to load vars for managed_node2 25201 1726882703.73703: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882703.73715: Calling all_plugins_play to load vars for managed_node2 25201 1726882703.73717: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882703.73725: Calling groups_plugins_play to load vars for managed_node2 25201 1726882703.81400: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882703.83255: done with get_vars() 25201 1726882703.83283: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 21:38:23 -0400 (0:00:01.707) 0:00:25.008 ****** 25201 1726882703.83372: entering _queue_task() for managed_node2/package_facts 25201 1726882703.83693: worker is 1 (out of 1 available) 25201 1726882703.83705: exiting _queue_task() for managed_node2/package_facts 25201 1726882703.83721: done queuing things up, now waiting for results queue to drain 25201 1726882703.83723: waiting for pending results... 
25201 1726882703.84008: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 25201 1726882703.84186: in run() - task 0e448fcc-3ce9-313b-197e-000000000519 25201 1726882703.84205: variable 'ansible_search_path' from source: unknown 25201 1726882703.84212: variable 'ansible_search_path' from source: unknown 25201 1726882703.84248: calling self._execute() 25201 1726882703.84355: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882703.84377: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882703.84397: variable 'omit' from source: magic vars 25201 1726882703.84804: variable 'ansible_distribution_major_version' from source: facts 25201 1726882703.84832: Evaluated conditional (ansible_distribution_major_version != '6'): True 25201 1726882703.84846: variable 'omit' from source: magic vars 25201 1726882703.84947: variable 'omit' from source: magic vars 25201 1726882703.84989: variable 'omit' from source: magic vars 25201 1726882703.85038: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25201 1726882703.85086: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25201 1726882703.85113: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25201 1726882703.85140: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25201 1726882703.85167: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25201 1726882703.85202: variable 'inventory_hostname' from source: host vars for 'managed_node2' 25201 1726882703.85210: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882703.85217: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882703.85335: Set connection var ansible_shell_executable to /bin/sh 25201 1726882703.85346: Set connection var ansible_pipelining to False 25201 1726882703.85361: Set connection var ansible_connection to ssh 25201 1726882703.85380: Set connection var ansible_module_compression to ZIP_DEFLATED 25201 1726882703.85388: Set connection var ansible_shell_type to sh 25201 1726882703.85399: Set connection var ansible_timeout to 10 25201 1726882703.85425: variable 'ansible_shell_executable' from source: unknown 25201 1726882703.85433: variable 'ansible_connection' from source: unknown 25201 1726882703.85439: variable 'ansible_module_compression' from source: unknown 25201 1726882703.85446: variable 'ansible_shell_type' from source: unknown 25201 1726882703.85452: variable 'ansible_shell_executable' from source: unknown 25201 1726882703.85460: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882703.85479: variable 'ansible_pipelining' from source: unknown 25201 1726882703.85486: variable 'ansible_timeout' from source: unknown 25201 1726882703.85494: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882703.85705: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 25201 1726882703.85721: variable 'omit' from source: magic vars 25201 
1726882703.85731: starting attempt loop 25201 1726882703.85738: running the handler 25201 1726882703.85755: _low_level_execute_command(): starting 25201 1726882703.85771: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 25201 1726882703.86558: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25201 1726882703.86580: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882703.86597: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882703.86615: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882703.86660: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882703.86682: stderr chunk (state=3): >>>debug2: match not found <<< 25201 1726882703.86696: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882703.86713: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25201 1726882703.86723: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 25201 1726882703.86733: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 25201 1726882703.86744: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882703.86759: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882703.86786: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882703.86805: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882703.86818: stderr chunk (state=3): >>>debug2: match found <<< 25201 1726882703.86833: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882703.86922: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882703.86946: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882703.86962: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882703.87107: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882703.88777: stdout chunk (state=3): >>>/root <<< 25201 1726882703.88960: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882703.88966: stdout chunk (state=3): >>><<< 25201 1726882703.88970: stderr chunk (state=3): >>><<< 25201 1726882703.89087: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25201 1726882703.89090: _low_level_execute_command(): starting 25201 1726882703.89093: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882703.8899024-26282-257458909511944 `" && echo ansible-tmp-1726882703.8899024-26282-257458909511944="` echo /root/.ansible/tmp/ansible-tmp-1726882703.8899024-26282-257458909511944 `" ) && sleep 0' 25201 1726882703.90508: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882703.90512: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882703.90546: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 25201 1726882703.90549: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882703.90552: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882703.90742: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882703.90748: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882703.90751: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882703.90857: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882703.92736: stdout chunk (state=3): >>>ansible-tmp-1726882703.8899024-26282-257458909511944=/root/.ansible/tmp/ansible-tmp-1726882703.8899024-26282-257458909511944 <<< 25201 1726882703.92852: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882703.92928: stderr chunk (state=3): >>><<< 25201 1726882703.92931: stdout chunk (state=3): >>><<< 25201 1726882703.93174: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882703.8899024-26282-257458909511944=/root/.ansible/tmp/ansible-tmp-1726882703.8899024-26282-257458909511944 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25201 1726882703.93178: variable 'ansible_module_compression' from source: unknown 25201 1726882703.93180: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-25201fmfeipqk/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 25201 1726882703.93182: variable 'ansible_facts' from source: unknown 25201 1726882703.93338: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882703.8899024-26282-257458909511944/AnsiballZ_package_facts.py 25201 1726882703.94420: Sending initial data 25201 1726882703.94424: Sent initial data (162 bytes) 25201 1726882703.96582: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25201 1726882703.96596: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882703.96615: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882703.96633: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882703.96738: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882703.96751: stderr chunk (state=3): >>>debug2: match not found <<< 25201 1726882703.96769: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882703.96792: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25201 1726882703.96805: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 25201 1726882703.96820: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 25201 1726882703.96837: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882703.96852: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882703.96873: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882703.96890: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882703.96901: stderr chunk (state=3): >>>debug2: match found <<< 25201 1726882703.96913: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882703.97068: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882703.97122: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882703.97137: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882703.97275: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882703.99082: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" 
revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 25201 1726882703.99179: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 25201 1726882703.99280: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-25201fmfeipqk/tmpvh9yggi_ /root/.ansible/tmp/ansible-tmp-1726882703.8899024-26282-257458909511944/AnsiballZ_package_facts.py <<< 25201 1726882703.99376: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 25201 1726882704.02485: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882704.02729: stderr chunk (state=3): >>><<< 25201 1726882704.02733: stdout chunk (state=3): >>><<< 25201 1726882704.02735: done transferring module to remote 25201 1726882704.02737: _low_level_execute_command(): starting 25201 1726882704.02739: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882703.8899024-26282-257458909511944/ /root/.ansible/tmp/ansible-tmp-1726882703.8899024-26282-257458909511944/AnsiballZ_package_facts.py && sleep 0' 25201 1726882704.03456: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25201 1726882704.03460: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882704.03508: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 25201 1726882704.03512: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882704.03514: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882704.03581: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882704.03585: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882704.03589: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882704.03696: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882704.05507: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882704.05568: stderr chunk (state=3): >>><<< 25201 1726882704.05581: stdout chunk (state=3): >>><<< 25201 1726882704.05665: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25201 1726882704.05669: _low_level_execute_command(): starting 25201 1726882704.05672: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882703.8899024-26282-257458909511944/AnsiballZ_package_facts.py && sleep 0' 25201 1726882704.06205: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25201 1726882704.06219: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882704.06234: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882704.06253: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882704.06298: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882704.06310: stderr chunk (state=3): >>>debug2: match not found <<< 25201 1726882704.06325: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882704.06342: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25201 1726882704.06355: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 25201 1726882704.06369: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 25201 1726882704.06382: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882704.06396: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882704.06411: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882704.06424: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882704.06435: stderr chunk (state=3): >>>debug2: match found <<< 25201 1726882704.06448: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882704.06524: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882704.06541: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882704.06557: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882704.06755: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882704.52852: stdout chunk (state=3): >>> {"ansible_facts": 
{"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "subscription-manager-rhsm-certificates": [{"name": "subscription-manager-rhsm-certificates", "version": "20220623", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "122.el9", "e<<< 25201 1726882704.52887: stdout chunk (state=3): >>>poch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dmidecode": [{"name": "dmidecode", "version": "3.6", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects"<<< 25201 1726882704.52895: stdout chunk (state=3): >>>: [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", 
"release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-iniparse": [{"name": "python3-iniparse", "version": "0.4", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-inotify": [{"name": "python3-inotify", "version": "0.9.6", "release": "25.el9", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-decorator": [{"name": "python3-decorator", "version": "4.4.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-cloud-what": [{"name": "python3-cloud-what", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]<<< 25201 1726882704.52926: stdout chunk (state=3): >>>, "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": 
[{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "virt-what": [{"name": "virt-what", "version": "1.25", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": 
"tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.1<<< 25201 1726882704.52941: stdout chunk (state=3): >>>6.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", 
"version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202<<< 25201 1726882704.52980: stdout chunk (state=3): >>>", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "usermode": [{"name": "usermode", "version": "1.114", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-<<< 25201 1726882704.52986: stdout chunk (state=3): >>>base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf-plugin-subscription-manager": [{"name": "libdnf-plugin-subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-librepo": [{"name": "python3-librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-subscription-manager-rhsm": [{"name": "python3-subscription-manager-rhsm", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "subscription-manager": [{"name": "subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "a<<< 25201 1726882704.53002: stdout chunk (state=3): >>>rch": "noarch", "source": "rpm"}], "policycoreutils-python-utils": [{"name": "policycoreutils-python-utils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "rhc": [{"name": "rhc", 
"version": "0.2.4", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", 
"release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, <<< 25201 1726882704.53019: stdout chunk (state=3): >>>"arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300<<< 25201 1726882704.53025: stdout chunk (state=3): >>>", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64"<<< 25201 1726882704.53030: stdout chunk (state=3): >>>, "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": 
[{"name": "kernel-headers", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_6<<< 25201 1726882704.53033: stdout chunk (state=3): >>>4", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": 
"1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", <<< 25201 1726882704.53058: stdout chunk (state=3): >>>"release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch<<< 25201 1726882704.53077: stdout chunk (state=3): >>>", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": 
"dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 25201 1726882704.54496: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. <<< 25201 1726882704.54552: stderr chunk (state=3): >>><<< 25201 1726882704.54555: stdout chunk (state=3): >>><<< 25201 1726882704.54594: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "subscription-manager-rhsm-certificates": [{"name": "subscription-manager-rhsm-certificates", "version": "20220623", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", 
"release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dmidecode": [{"name": "dmidecode", "version": "3.6", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": 
"3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": 
"python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-iniparse": [{"name": "python3-iniparse", "version": "0.4", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-inotify": [{"name": "python3-inotify", "version": "0.9.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-decorator": [{"name": "python3-decorator", "version": "4.4.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-cloud-what": [{"name": "python3-cloud-what", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "508.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "virt-what": [{"name": "virt-what", "version": "1.25", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", 
"version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": 
"0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": 
[{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "usermode": [{"name": "usermode", "version": "1.114", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf-plugin-subscription-manager": [{"name": "libdnf-plugin-subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-librepo": [{"name": "python3-librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-subscription-manager-rhsm": [{"name": "python3-subscription-manager-rhsm", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "subscription-manager": [{"name": "subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "policycoreutils-python-utils": [{"name": 
"policycoreutils-python-utils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "rhc": [{"name": "rhc", "version": "0.2.4", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": 
"92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", 
"release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], 
"perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", 
"release": "2.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": 
"7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
25201 1726882704.56028: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882703.8899024-26282-257458909511944/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 25201 1726882704.56044: _low_level_execute_command(): starting 25201 1726882704.56049: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882703.8899024-26282-257458909511944/ > /dev/null 2>&1 && sleep 0' 25201 1726882704.56498: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882704.56505: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882704.56532: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882704.56544: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882704.56594: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882704.56606: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882704.56617: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882704.56723: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882704.58548: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882704.58590: stderr chunk (state=3): >>><<< 25201 1726882704.58593: stdout chunk (state=3): >>><<< 25201 1726882704.58604: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25201 1726882704.58609: handler run complete 25201 1726882704.59106: variable 'ansible_facts' from source: unknown 25201 1726882704.59402: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882704.60580: variable 'ansible_facts' from source: unknown 25201 1726882704.60895: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882704.61335: attempt loop complete, returning result 25201 1726882704.61346: _execute() done 25201 1726882704.61349: dumping result to json 25201 1726882704.61482: done dumping result, returning 25201 1726882704.61490: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [0e448fcc-3ce9-313b-197e-000000000519] 25201 1726882704.61495: sending task result for task 0e448fcc-3ce9-313b-197e-000000000519 25201 1726882704.62772: done sending task result for task 0e448fcc-3ce9-313b-197e-000000000519 25201 1726882704.62775: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 25201 1726882704.62857: no more pending results, returning what we have 25201 1726882704.62859: results queue empty 25201 1726882704.62860: checking for any_errors_fatal 25201 1726882704.62870: done checking for any_errors_fatal 25201 1726882704.62871: checking for max_fail_percentage 25201 1726882704.62872: done checking for max_fail_percentage 25201 1726882704.62872: checking to see if all hosts have failed and the running result is not ok 25201 1726882704.62873: done checking to see if all hosts have failed 25201 1726882704.62873: getting the remaining hosts for this loop 25201 1726882704.62874: done getting the remaining hosts for this loop 25201 1726882704.62877: getting the next task for host managed_node2 25201 1726882704.62882: done getting next task for host managed_node2 25201 1726882704.62884: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 25201 1726882704.62886: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 25201 1726882704.62893: getting variables 25201 1726882704.62894: in VariableManager get_vars() 25201 1726882704.62919: Calling all_inventory to load vars for managed_node2 25201 1726882704.62921: Calling groups_inventory to load vars for managed_node2 25201 1726882704.62922: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882704.62929: Calling all_plugins_play to load vars for managed_node2 25201 1726882704.62930: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882704.62932: Calling groups_plugins_play to load vars for managed_node2 25201 1726882704.63678: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882704.64612: done with get_vars() 25201 1726882704.64628: done getting variables 25201 1726882704.64675: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 21:38:24 -0400 (0:00:00.813) 0:00:25.821 ****** 25201 1726882704.64700: entering _queue_task() for managed_node2/debug 25201 1726882704.64911: worker is 1 (out of 1 available) 25201 1726882704.64925: exiting _queue_task() for managed_node2/debug 25201 1726882704.64937: done queuing things up, now waiting for results queue to drain 25201 1726882704.64938: waiting for pending results... 25201 1726882704.65117: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider 25201 1726882704.65201: in run() - task 0e448fcc-3ce9-313b-197e-00000000006f 25201 1726882704.65212: variable 'ansible_search_path' from source: unknown 25201 1726882704.65216: variable 'ansible_search_path' from source: unknown 25201 1726882704.65245: calling self._execute() 25201 1726882704.65319: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882704.65323: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882704.65333: variable 'omit' from source: magic vars 25201 1726882704.65622: variable 'ansible_distribution_major_version' from source: facts 25201 1726882704.65633: Evaluated conditional (ansible_distribution_major_version != '6'): True 25201 1726882704.65639: variable 'omit' from source: magic vars 25201 1726882704.65682: variable 'omit' from source: magic vars 25201 1726882704.65751: variable 'network_provider' from source: set_fact 25201 1726882704.65769: variable 'omit' from source: magic vars 25201 1726882704.65806: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25201 1726882704.65833: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25201 1726882704.65847: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25201 1726882704.65860: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25201 1726882704.65873: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25201 
1726882704.65897: variable 'inventory_hostname' from source: host vars for 'managed_node2' 25201 1726882704.65900: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882704.65905: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882704.65976: Set connection var ansible_shell_executable to /bin/sh 25201 1726882704.65979: Set connection var ansible_pipelining to False 25201 1726882704.65984: Set connection var ansible_connection to ssh 25201 1726882704.65989: Set connection var ansible_module_compression to ZIP_DEFLATED 25201 1726882704.65991: Set connection var ansible_shell_type to sh 25201 1726882704.65998: Set connection var ansible_timeout to 10 25201 1726882704.66019: variable 'ansible_shell_executable' from source: unknown 25201 1726882704.66023: variable 'ansible_connection' from source: unknown 25201 1726882704.66027: variable 'ansible_module_compression' from source: unknown 25201 1726882704.66029: variable 'ansible_shell_type' from source: unknown 25201 1726882704.66032: variable 'ansible_shell_executable' from source: unknown 25201 1726882704.66034: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882704.66038: variable 'ansible_pipelining' from source: unknown 25201 1726882704.66040: variable 'ansible_timeout' from source: unknown 25201 1726882704.66043: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882704.66134: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 25201 1726882704.66143: variable 'omit' from source: magic vars 25201 1726882704.66146: starting attempt loop 25201 1726882704.66149: running the handler 25201 1726882704.66187: handler run complete 25201 1726882704.66197: attempt loop complete, returning result 25201 1726882704.66200: _execute() done 25201 1726882704.66202: dumping result to json 25201 1726882704.66205: done dumping result, returning 25201 1726882704.66212: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider [0e448fcc-3ce9-313b-197e-00000000006f] 25201 1726882704.66217: sending task result for task 0e448fcc-3ce9-313b-197e-00000000006f 25201 1726882704.66301: done sending task result for task 0e448fcc-3ce9-313b-197e-00000000006f 25201 1726882704.66303: WORKER PROCESS EXITING ok: [managed_node2] => {} MSG: Using network provider: nm 25201 1726882704.66370: no more pending results, returning what we have 25201 1726882704.66373: results queue empty 25201 1726882704.66374: checking for any_errors_fatal 25201 1726882704.66383: done checking for any_errors_fatal 25201 1726882704.66384: checking for max_fail_percentage 25201 1726882704.66386: done checking for max_fail_percentage 25201 1726882704.66386: checking to see if all hosts have failed and the running result is not ok 25201 1726882704.66387: done checking to see if all hosts have failed 25201 1726882704.66388: getting the remaining hosts for this loop 25201 1726882704.66389: done getting the remaining hosts for this loop 25201 1726882704.66392: getting the next task for host managed_node2 25201 1726882704.66397: done getting next task for host managed_node2 25201 1726882704.66401: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state 
configuration if using the `network_state` variable with the initscripts provider 25201 1726882704.66404: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 25201 1726882704.66413: getting variables 25201 1726882704.66414: in VariableManager get_vars() 25201 1726882704.66445: Calling all_inventory to load vars for managed_node2 25201 1726882704.66452: Calling groups_inventory to load vars for managed_node2 25201 1726882704.66455: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882704.66461: Calling all_plugins_play to load vars for managed_node2 25201 1726882704.66467: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882704.66469: Calling groups_plugins_play to load vars for managed_node2 25201 1726882704.67217: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882704.68250: done with get_vars() 25201 1726882704.68268: done getting variables 25201 1726882704.68306: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 21:38:24 -0400 (0:00:00.036) 0:00:25.857 ****** 25201 1726882704.68330: entering _queue_task() for managed_node2/fail 25201 1726882704.68517: worker is 1 (out of 1 available) 25201 1726882704.68530: exiting _queue_task() for managed_node2/fail 25201 1726882704.68540: done queuing things up, now waiting for results queue to drain 25201 1726882704.68542: waiting for pending results... 
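For orientation, here is a minimal sketch of the debug task logged above (roles/network/tasks/main.yml:7), reconstructed only from the task name, the 'debug' action plugin, and the rendered message "Using network provider: nm"; it is an approximation, not the role's verbatim source.

# Reconstructed from the log output above; the actual role task may differ.
- name: Print network provider
  ansible.builtin.debug:
    msg: "Using network provider: {{ network_provider }}"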
25201 1726882704.68707: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 25201 1726882704.68792: in run() - task 0e448fcc-3ce9-313b-197e-000000000070 25201 1726882704.68800: variable 'ansible_search_path' from source: unknown 25201 1726882704.68804: variable 'ansible_search_path' from source: unknown 25201 1726882704.68832: calling self._execute() 25201 1726882704.68898: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882704.68903: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882704.68911: variable 'omit' from source: magic vars 25201 1726882704.69172: variable 'ansible_distribution_major_version' from source: facts 25201 1726882704.69198: Evaluated conditional (ansible_distribution_major_version != '6'): True 25201 1726882704.69341: variable 'network_state' from source: role '' defaults 25201 1726882704.69349: Evaluated conditional (network_state != {}): False 25201 1726882704.69352: when evaluation is False, skipping this task 25201 1726882704.69354: _execute() done 25201 1726882704.69357: dumping result to json 25201 1726882704.69359: done dumping result, returning 25201 1726882704.69367: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0e448fcc-3ce9-313b-197e-000000000070] 25201 1726882704.69374: sending task result for task 0e448fcc-3ce9-313b-197e-000000000070 25201 1726882704.69460: done sending task result for task 0e448fcc-3ce9-313b-197e-000000000070 25201 1726882704.69465: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 25201 1726882704.69529: no more pending results, returning what we have 25201 1726882704.69532: results queue empty 25201 1726882704.69533: checking for any_errors_fatal 25201 1726882704.69537: done checking for any_errors_fatal 25201 1726882704.69538: checking for max_fail_percentage 25201 1726882704.69539: done checking for max_fail_percentage 25201 1726882704.69540: checking to see if all hosts have failed and the running result is not ok 25201 1726882704.69541: done checking to see if all hosts have failed 25201 1726882704.69541: getting the remaining hosts for this loop 25201 1726882704.69543: done getting the remaining hosts for this loop 25201 1726882704.69546: getting the next task for host managed_node2 25201 1726882704.69550: done getting next task for host managed_node2 25201 1726882704.69554: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 25201 1726882704.69557: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 25201 1726882704.69572: getting variables 25201 1726882704.69574: in VariableManager get_vars() 25201 1726882704.69604: Calling all_inventory to load vars for managed_node2 25201 1726882704.69607: Calling groups_inventory to load vars for managed_node2 25201 1726882704.69608: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882704.69614: Calling all_plugins_play to load vars for managed_node2 25201 1726882704.69616: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882704.69618: Calling groups_plugins_play to load vars for managed_node2 25201 1726882704.70352: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882704.71772: done with get_vars() 25201 1726882704.71791: done getting variables 25201 1726882704.71846: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 21:38:24 -0400 (0:00:00.035) 0:00:25.893 ****** 25201 1726882704.71879: entering _queue_task() for managed_node2/fail 25201 1726882704.72104: worker is 1 (out of 1 available) 25201 1726882704.72118: exiting _queue_task() for managed_node2/fail 25201 1726882704.72129: done queuing things up, now waiting for results queue to drain 25201 1726882704.72130: waiting for pending results... 
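A hedged sketch of the fail task at roles/network/tasks/main.yml:11 that was skipped above. Only "network_state != {}" is visible in the log as the condition that evaluated False; the initscripts check and the failure message are inferred from the task name and are assumptions.

# Approximation based on the task name and the false_condition reported above.
- name: Abort applying the network state configuration if using the `network_state` variable with the initscripts provider
  ansible.builtin.fail:
    msg: Applying network_state is not supported with the initscripts provider  # illustrative message
  when:
    - network_state != {}                  # reported as the false condition in the log
    - network_provider == "initscripts"    # assumed from the task name, not shown in this run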
25201 1726882704.72380: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 25201 1726882704.72482: in run() - task 0e448fcc-3ce9-313b-197e-000000000071 25201 1726882704.72498: variable 'ansible_search_path' from source: unknown 25201 1726882704.72501: variable 'ansible_search_path' from source: unknown 25201 1726882704.72527: calling self._execute() 25201 1726882704.72603: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882704.72606: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882704.72615: variable 'omit' from source: magic vars 25201 1726882704.72888: variable 'ansible_distribution_major_version' from source: facts 25201 1726882704.72897: Evaluated conditional (ansible_distribution_major_version != '6'): True 25201 1726882704.72985: variable 'network_state' from source: role '' defaults 25201 1726882704.72993: Evaluated conditional (network_state != {}): False 25201 1726882704.72996: when evaluation is False, skipping this task 25201 1726882704.72999: _execute() done 25201 1726882704.73001: dumping result to json 25201 1726882704.73003: done dumping result, returning 25201 1726882704.73013: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0e448fcc-3ce9-313b-197e-000000000071] 25201 1726882704.73016: sending task result for task 0e448fcc-3ce9-313b-197e-000000000071 25201 1726882704.73107: done sending task result for task 0e448fcc-3ce9-313b-197e-000000000071 25201 1726882704.73110: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 25201 1726882704.73160: no more pending results, returning what we have 25201 1726882704.73167: results queue empty 25201 1726882704.73168: checking for any_errors_fatal 25201 1726882704.73173: done checking for any_errors_fatal 25201 1726882704.73174: checking for max_fail_percentage 25201 1726882704.73175: done checking for max_fail_percentage 25201 1726882704.73176: checking to see if all hosts have failed and the running result is not ok 25201 1726882704.73177: done checking to see if all hosts have failed 25201 1726882704.73177: getting the remaining hosts for this loop 25201 1726882704.73179: done getting the remaining hosts for this loop 25201 1726882704.73182: getting the next task for host managed_node2 25201 1726882704.73187: done getting next task for host managed_node2 25201 1726882704.73191: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 25201 1726882704.73193: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 25201 1726882704.73207: getting variables 25201 1726882704.73209: in VariableManager get_vars() 25201 1726882704.73238: Calling all_inventory to load vars for managed_node2 25201 1726882704.73240: Calling groups_inventory to load vars for managed_node2 25201 1726882704.73241: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882704.73247: Calling all_plugins_play to load vars for managed_node2 25201 1726882704.73249: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882704.73251: Calling groups_plugins_play to load vars for managed_node2 25201 1726882704.74190: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882704.75860: done with get_vars() 25201 1726882704.75884: done getting variables 25201 1726882704.75936: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 21:38:24 -0400 (0:00:00.040) 0:00:25.934 ****** 25201 1726882704.75973: entering _queue_task() for managed_node2/fail 25201 1726882704.76199: worker is 1 (out of 1 available) 25201 1726882704.76211: exiting _queue_task() for managed_node2/fail 25201 1726882704.76221: done queuing things up, now waiting for results queue to drain 25201 1726882704.76222: waiting for pending results... 
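Similarly, the fail task at main.yml:18 skipped above can be sketched as follows; the "network_state != {}" condition is taken from the log, while the version comparison and the message are assumptions based on the task name.

# Approximation; only the first condition appears in the log (it short-circuited).
- name: Abort applying the network state configuration if the system version of the managed host is below 8
  ansible.builtin.fail:
    msg: Applying network_state requires EL 8 or later  # illustrative message
  when:
    - network_state != {}                            # reported False above
    - ansible_distribution_major_version | int < 8   # assumed from the task name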
25201 1726882704.76496: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 25201 1726882704.76633: in run() - task 0e448fcc-3ce9-313b-197e-000000000072 25201 1726882704.76649: variable 'ansible_search_path' from source: unknown 25201 1726882704.76656: variable 'ansible_search_path' from source: unknown 25201 1726882704.76701: calling self._execute() 25201 1726882704.76798: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882704.76809: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882704.76822: variable 'omit' from source: magic vars 25201 1726882704.77208: variable 'ansible_distribution_major_version' from source: facts 25201 1726882704.77224: Evaluated conditional (ansible_distribution_major_version != '6'): True 25201 1726882704.77404: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 25201 1726882704.79840: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 25201 1726882704.79912: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 25201 1726882704.79953: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 25201 1726882704.79996: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 25201 1726882704.80029: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 25201 1726882704.80111: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25201 1726882704.80159: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25201 1726882704.80192: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25201 1726882704.80239: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25201 1726882704.80260: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25201 1726882704.80361: variable 'ansible_distribution_major_version' from source: facts 25201 1726882704.80383: Evaluated conditional (ansible_distribution_major_version | int > 9): False 25201 1726882704.80391: when evaluation is False, skipping this task 25201 1726882704.80397: _execute() done 25201 1726882704.80402: dumping result to json 25201 1726882704.80408: done dumping result, returning 25201 1726882704.80424: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0e448fcc-3ce9-313b-197e-000000000072] 25201 1726882704.80432: sending task result for task 
0e448fcc-3ce9-313b-197e-000000000072 skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int > 9", "skip_reason": "Conditional result was False" } 25201 1726882704.80577: no more pending results, returning what we have 25201 1726882704.80581: results queue empty 25201 1726882704.80582: checking for any_errors_fatal 25201 1726882704.80588: done checking for any_errors_fatal 25201 1726882704.80589: checking for max_fail_percentage 25201 1726882704.80591: done checking for max_fail_percentage 25201 1726882704.80591: checking to see if all hosts have failed and the running result is not ok 25201 1726882704.80593: done checking to see if all hosts have failed 25201 1726882704.80593: getting the remaining hosts for this loop 25201 1726882704.80595: done getting the remaining hosts for this loop 25201 1726882704.80599: getting the next task for host managed_node2 25201 1726882704.80607: done getting next task for host managed_node2 25201 1726882704.80611: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 25201 1726882704.80614: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 25201 1726882704.80632: getting variables 25201 1726882704.80634: in VariableManager get_vars() 25201 1726882704.80680: Calling all_inventory to load vars for managed_node2 25201 1726882704.80683: Calling groups_inventory to load vars for managed_node2 25201 1726882704.80686: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882704.80696: Calling all_plugins_play to load vars for managed_node2 25201 1726882704.80700: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882704.80702: Calling groups_plugins_play to load vars for managed_node2 25201 1726882704.81707: done sending task result for task 0e448fcc-3ce9-313b-197e-000000000072 25201 1726882704.81710: WORKER PROCESS EXITING 25201 1726882704.82559: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882704.84293: done with get_vars() 25201 1726882704.84316: done getting variables 25201 1726882704.84383: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 21:38:24 -0400 (0:00:00.084) 0:00:26.018 ****** 25201 1726882704.84416: entering _queue_task() for managed_node2/dnf 25201 1726882704.84726: worker is 1 (out of 1 available) 25201 1726882704.84739: exiting _queue_task() for managed_node2/dnf 25201 1726882704.84751: done queuing things up, now waiting for results queue to drain 25201 1726882704.84752: waiting for pending results... 
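The teaming abort at main.yml:25 was skipped because "ansible_distribution_major_version | int > 9" evaluated to False on this host. A rough sketch, with the failure message invented for illustration:

# Approximation based on the task name and the conditional evaluated in the log.
- name: Abort applying teaming configuration if the system version of the managed host is EL10 or later
  ansible.builtin.fail:
    msg: Teaming is not supported on EL10 or later  # illustrative message
  when:
    - ansible_distribution_major_version | int > 9  # reported False in the log
    # the real role likely also checks that a team interface is actually
    # requested in network_connections; that check is not visible in this run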
25201 1726882704.85050: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 25201 1726882704.85199: in run() - task 0e448fcc-3ce9-313b-197e-000000000073 25201 1726882704.85224: variable 'ansible_search_path' from source: unknown 25201 1726882704.85232: variable 'ansible_search_path' from source: unknown 25201 1726882704.85273: calling self._execute() 25201 1726882704.85378: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882704.85390: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882704.85404: variable 'omit' from source: magic vars 25201 1726882704.85808: variable 'ansible_distribution_major_version' from source: facts 25201 1726882704.85824: Evaluated conditional (ansible_distribution_major_version != '6'): True 25201 1726882704.86026: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 25201 1726882704.88719: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 25201 1726882704.88792: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 25201 1726882704.88832: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 25201 1726882704.88872: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 25201 1726882704.88911: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 25201 1726882704.88990: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25201 1726882704.89034: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25201 1726882704.89068: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25201 1726882704.89124: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25201 1726882704.89144: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25201 1726882704.89268: variable 'ansible_distribution' from source: facts 25201 1726882704.89279: variable 'ansible_distribution_major_version' from source: facts 25201 1726882704.89296: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 25201 1726882704.89417: variable '__network_wireless_connections_defined' from source: role '' defaults 25201 1726882704.89569: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25201 1726882704.89599: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25201 1726882704.89623: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25201 1726882704.89668: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25201 1726882704.89689: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25201 1726882704.89732: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25201 1726882704.89766: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25201 1726882704.89794: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25201 1726882704.89829: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25201 1726882704.89846: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25201 1726882704.89893: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25201 1726882704.89916: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25201 1726882704.89943: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25201 1726882704.89991: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25201 1726882704.90010: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25201 1726882704.90175: variable 'network_connections' from source: task vars 25201 1726882704.90194: variable 'interface' from source: play vars 25201 1726882704.90263: variable 'interface' from source: play vars 25201 1726882704.90358: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 25201 1726882704.90534: Loading TestModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 25201 1726882704.90576: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 25201 1726882704.90612: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 25201 1726882704.90652: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 25201 1726882704.90699: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 25201 1726882704.90726: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 25201 1726882704.90772: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 25201 1726882704.90801: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 25201 1726882704.90855: variable '__network_team_connections_defined' from source: role '' defaults 25201 1726882704.91104: variable 'network_connections' from source: task vars 25201 1726882704.91113: variable 'interface' from source: play vars 25201 1726882704.91182: variable 'interface' from source: play vars 25201 1726882704.91208: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 25201 1726882704.91216: when evaluation is False, skipping this task 25201 1726882704.91222: _execute() done 25201 1726882704.91227: dumping result to json 25201 1726882704.91233: done dumping result, returning 25201 1726882704.91243: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0e448fcc-3ce9-313b-197e-000000000073] 25201 1726882704.91251: sending task result for task 0e448fcc-3ce9-313b-197e-000000000073 25201 1726882704.91356: done sending task result for task 0e448fcc-3ce9-313b-197e-000000000073 25201 1726882704.91365: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 25201 1726882704.91432: no more pending results, returning what we have 25201 1726882704.91436: results queue empty 25201 1726882704.91437: checking for any_errors_fatal 25201 1726882704.91442: done checking for any_errors_fatal 25201 1726882704.91443: checking for max_fail_percentage 25201 1726882704.91445: done checking for max_fail_percentage 25201 1726882704.91446: checking to see if all hosts have failed and the running result is not ok 25201 1726882704.91447: done checking to see if all hosts have failed 25201 1726882704.91448: getting the remaining hosts for this loop 25201 1726882704.91449: done getting the remaining hosts for this loop 25201 1726882704.91453: getting the next task for host managed_node2 25201 1726882704.91461: done getting next task for host managed_node2 25201 1726882704.91466: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates 
for network packages are available through the YUM package manager due to wireless or team interfaces 25201 1726882704.91469: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 25201 1726882704.91488: getting variables 25201 1726882704.91490: in VariableManager get_vars() 25201 1726882704.91528: Calling all_inventory to load vars for managed_node2 25201 1726882704.91530: Calling groups_inventory to load vars for managed_node2 25201 1726882704.91533: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882704.91542: Calling all_plugins_play to load vars for managed_node2 25201 1726882704.91546: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882704.91548: Calling groups_plugins_play to load vars for managed_node2 25201 1726882704.93309: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882704.95049: done with get_vars() 25201 1726882704.95072: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 25201 1726882704.95148: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 21:38:24 -0400 (0:00:00.107) 0:00:26.126 ****** 25201 1726882704.95179: entering _queue_task() for managed_node2/yum 25201 1726882704.95461: worker is 1 (out of 1 available) 25201 1726882704.95476: exiting _queue_task() for managed_node2/yum 25201 1726882704.95487: done queuing things up, now waiting for results queue to drain 25201 1726882704.95489: waiting for pending results... 
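The DNF update check at main.yml:36 was skipped above because neither wireless nor team connections are defined. The sketch below is assembled from the 'dnf' action plugin and the two conditionals evaluated in the log; the module arguments and the check_mode flag are assumptions, since the actual arguments are not shown here.

# Reconstructed from the plugin name and the conditionals in the log;
# arguments are assumptions.
- name: Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
  ansible.builtin.dnf:
    name: "{{ network_packages }}"  # assumed; module arguments are not shown in this log
    state: latest
  check_mode: true                  # assumed: the task only checks for available updates
  when:
    - ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7
    - __network_wireless_connections_defined or __network_team_connections_defined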
25201 1726882704.95770: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 25201 1726882704.95926: in run() - task 0e448fcc-3ce9-313b-197e-000000000074 25201 1726882704.95948: variable 'ansible_search_path' from source: unknown 25201 1726882704.95958: variable 'ansible_search_path' from source: unknown 25201 1726882704.96008: calling self._execute() 25201 1726882704.96103: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882704.96118: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882704.96130: variable 'omit' from source: magic vars 25201 1726882704.96516: variable 'ansible_distribution_major_version' from source: facts 25201 1726882704.96532: Evaluated conditional (ansible_distribution_major_version != '6'): True 25201 1726882704.96724: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 25201 1726882704.99147: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 25201 1726882704.99224: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 25201 1726882704.99272: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 25201 1726882704.99316: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 25201 1726882704.99347: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 25201 1726882704.99435: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25201 1726882704.99488: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25201 1726882704.99524: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25201 1726882704.99573: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25201 1726882704.99598: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25201 1726882704.99705: variable 'ansible_distribution_major_version' from source: facts 25201 1726882704.99726: Evaluated conditional (ansible_distribution_major_version | int < 8): False 25201 1726882704.99736: when evaluation is False, skipping this task 25201 1726882704.99743: _execute() done 25201 1726882704.99750: dumping result to json 25201 1726882704.99757: done dumping result, returning 25201 1726882704.99771: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0e448fcc-3ce9-313b-197e-000000000074] 25201 
1726882704.99781: sending task result for task 0e448fcc-3ce9-313b-197e-000000000074 skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 25201 1726882704.99942: no more pending results, returning what we have 25201 1726882704.99946: results queue empty 25201 1726882704.99947: checking for any_errors_fatal 25201 1726882704.99954: done checking for any_errors_fatal 25201 1726882704.99956: checking for max_fail_percentage 25201 1726882704.99958: done checking for max_fail_percentage 25201 1726882704.99959: checking to see if all hosts have failed and the running result is not ok 25201 1726882704.99960: done checking to see if all hosts have failed 25201 1726882704.99960: getting the remaining hosts for this loop 25201 1726882704.99962: done getting the remaining hosts for this loop 25201 1726882704.99968: getting the next task for host managed_node2 25201 1726882704.99976: done getting next task for host managed_node2 25201 1726882704.99981: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 25201 1726882704.99984: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 25201 1726882705.00001: getting variables 25201 1726882705.00003: in VariableManager get_vars() 25201 1726882705.00043: Calling all_inventory to load vars for managed_node2 25201 1726882705.00046: Calling groups_inventory to load vars for managed_node2 25201 1726882705.00049: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882705.00059: Calling all_plugins_play to load vars for managed_node2 25201 1726882705.00062: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882705.00067: Calling groups_plugins_play to load vars for managed_node2 25201 1726882705.01111: done sending task result for task 0e448fcc-3ce9-313b-197e-000000000074 25201 1726882705.01114: WORKER PROCESS EXITING 25201 1726882705.01806: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882705.03743: done with get_vars() 25201 1726882705.03768: done getting variables 25201 1726882705.03826: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 21:38:25 -0400 (0:00:00.086) 0:00:26.213 ****** 25201 1726882705.03867: entering _queue_task() for managed_node2/fail 25201 1726882705.04138: worker is 1 (out of 1 available) 25201 1726882705.04151: exiting _queue_task() for managed_node2/fail 25201 1726882705.04168: done queuing things up, now waiting for results queue to drain 25201 1726882705.04170: waiting for pending results... 
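The YUM counterpart at main.yml:48 (note the "redirecting ansible.builtin.yum to ansible.builtin.dnf" line above) was skipped because the managed host is EL8 or newer. A sketch under the same assumptions as the DNF task above:

# The EL7-only counterpart of the DNF check; only the version condition was
# evaluated in this run, the rest is assumed by analogy with the DNF task.
- name: Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
  ansible.builtin.yum:
    name: "{{ network_packages }}"  # assumed
    state: latest
  check_mode: true                  # assumed
  when:
    - ansible_distribution_major_version | int < 8  # reported False above
    - __network_wireless_connections_defined or __network_team_connections_defined  # assumed, not evaluated in this run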
25201 1726882705.04462: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 25201 1726882705.04622: in run() - task 0e448fcc-3ce9-313b-197e-000000000075 25201 1726882705.04641: variable 'ansible_search_path' from source: unknown 25201 1726882705.04649: variable 'ansible_search_path' from source: unknown 25201 1726882705.04692: calling self._execute() 25201 1726882705.04796: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882705.04809: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882705.04829: variable 'omit' from source: magic vars 25201 1726882705.05225: variable 'ansible_distribution_major_version' from source: facts 25201 1726882705.05246: Evaluated conditional (ansible_distribution_major_version != '6'): True 25201 1726882705.05384: variable '__network_wireless_connections_defined' from source: role '' defaults 25201 1726882705.05592: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 25201 1726882705.08025: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 25201 1726882705.08099: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 25201 1726882705.08137: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 25201 1726882705.08180: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 25201 1726882705.08213: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 25201 1726882705.08295: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25201 1726882705.08346: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25201 1726882705.08380: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25201 1726882705.08433: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25201 1726882705.08453: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25201 1726882705.08509: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25201 1726882705.08542: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25201 1726882705.08575: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25201 1726882705.08626: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25201 1726882705.08648: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25201 1726882705.08695: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25201 1726882705.08728: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25201 1726882705.08762: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25201 1726882705.08809: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25201 1726882705.08834: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25201 1726882705.09021: variable 'network_connections' from source: task vars 25201 1726882705.09043: variable 'interface' from source: play vars 25201 1726882705.09116: variable 'interface' from source: play vars 25201 1726882705.09200: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 25201 1726882705.09376: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 25201 1726882705.09421: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 25201 1726882705.09454: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 25201 1726882705.09497: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 25201 1726882705.09541: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 25201 1726882705.09570: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 25201 1726882705.09608: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 25201 1726882705.09638: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 25201 1726882705.09695: variable '__network_team_connections_defined' from source: role '' defaults 25201 1726882705.09962: variable 'network_connections' from 
source: task vars 25201 1726882705.09976: variable 'interface' from source: play vars 25201 1726882705.10045: variable 'interface' from source: play vars 25201 1726882705.10075: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 25201 1726882705.10083: when evaluation is False, skipping this task 25201 1726882705.10089: _execute() done 25201 1726882705.10095: dumping result to json 25201 1726882705.10102: done dumping result, returning 25201 1726882705.10111: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0e448fcc-3ce9-313b-197e-000000000075] 25201 1726882705.10119: sending task result for task 0e448fcc-3ce9-313b-197e-000000000075 25201 1726882705.10233: done sending task result for task 0e448fcc-3ce9-313b-197e-000000000075 25201 1726882705.10244: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 25201 1726882705.10296: no more pending results, returning what we have 25201 1726882705.10300: results queue empty 25201 1726882705.10301: checking for any_errors_fatal 25201 1726882705.10306: done checking for any_errors_fatal 25201 1726882705.10307: checking for max_fail_percentage 25201 1726882705.10309: done checking for max_fail_percentage 25201 1726882705.10310: checking to see if all hosts have failed and the running result is not ok 25201 1726882705.10311: done checking to see if all hosts have failed 25201 1726882705.10312: getting the remaining hosts for this loop 25201 1726882705.10313: done getting the remaining hosts for this loop 25201 1726882705.10317: getting the next task for host managed_node2 25201 1726882705.10325: done getting next task for host managed_node2 25201 1726882705.10328: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 25201 1726882705.10331: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 25201 1726882705.10351: getting variables 25201 1726882705.10353: in VariableManager get_vars() 25201 1726882705.10393: Calling all_inventory to load vars for managed_node2 25201 1726882705.10397: Calling groups_inventory to load vars for managed_node2 25201 1726882705.10399: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882705.10410: Calling all_plugins_play to load vars for managed_node2 25201 1726882705.10413: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882705.10416: Calling groups_plugins_play to load vars for managed_node2 25201 1726882705.12078: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882705.13794: done with get_vars() 25201 1726882705.13815: done getting variables 25201 1726882705.13874: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 21:38:25 -0400 (0:00:00.100) 0:00:26.313 ****** 25201 1726882705.13908: entering _queue_task() for managed_node2/package 25201 1726882705.14150: worker is 1 (out of 1 available) 25201 1726882705.14163: exiting _queue_task() for managed_node2/package 25201 1726882705.14178: done queuing things up, now waiting for results queue to drain 25201 1726882705.14179: waiting for pending results... 25201 1726882705.14439: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages 25201 1726882705.14583: in run() - task 0e448fcc-3ce9-313b-197e-000000000076 25201 1726882705.14599: variable 'ansible_search_path' from source: unknown 25201 1726882705.14607: variable 'ansible_search_path' from source: unknown 25201 1726882705.14646: calling self._execute() 25201 1726882705.14744: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882705.14756: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882705.14772: variable 'omit' from source: magic vars 25201 1726882705.15121: variable 'ansible_distribution_major_version' from source: facts 25201 1726882705.15137: Evaluated conditional (ansible_distribution_major_version != '6'): True 25201 1726882705.15334: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 25201 1726882705.15595: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 25201 1726882705.15644: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 25201 1726882705.15686: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 25201 1726882705.15743: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 25201 1726882705.15851: variable 'network_packages' from source: role '' defaults 25201 1726882705.15967: variable '__network_provider_setup' from source: role '' defaults 25201 1726882705.15982: variable '__network_service_name_default_nm' from source: role '' defaults 25201 1726882705.16051: variable 
'__network_service_name_default_nm' from source: role '' defaults 25201 1726882705.16066: variable '__network_packages_default_nm' from source: role '' defaults 25201 1726882705.16130: variable '__network_packages_default_nm' from source: role '' defaults 25201 1726882705.16320: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 25201 1726882705.18496: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 25201 1726882705.18537: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 25201 1726882705.18562: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 25201 1726882705.18588: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 25201 1726882705.18608: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 25201 1726882705.18662: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25201 1726882705.18687: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25201 1726882705.18704: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25201 1726882705.18733: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25201 1726882705.18743: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25201 1726882705.18776: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25201 1726882705.18792: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25201 1726882705.18808: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25201 1726882705.18834: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25201 1726882705.18848: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25201 1726882705.18986: variable '__network_packages_default_gobject_packages' from source: role '' defaults 25201 1726882705.19056: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25201 1726882705.19077: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25201 1726882705.19094: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25201 1726882705.19118: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25201 1726882705.19128: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25201 1726882705.19191: variable 'ansible_python' from source: facts 25201 1726882705.19209: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 25201 1726882705.19268: variable '__network_wpa_supplicant_required' from source: role '' defaults 25201 1726882705.19322: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 25201 1726882705.19405: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25201 1726882705.19423: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25201 1726882705.19440: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25201 1726882705.19468: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25201 1726882705.19482: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25201 1726882705.19513: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25201 1726882705.19532: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25201 1726882705.19548: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25201 1726882705.19576: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25201 1726882705.19590: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25201 1726882705.19742: variable 'network_connections' from source: task vars 25201 1726882705.19752: variable 'interface' from source: play vars 25201 1726882705.19896: variable 'interface' from source: play vars 25201 1726882705.19979: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 25201 1726882705.20010: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 25201 1726882705.20045: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 25201 1726882705.20094: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 25201 1726882705.20141: variable '__network_wireless_connections_defined' from source: role '' defaults 25201 1726882705.20475: variable 'network_connections' from source: task vars 25201 1726882705.20487: variable 'interface' from source: play vars 25201 1726882705.20704: variable 'interface' from source: play vars 25201 1726882705.20737: variable '__network_packages_default_wireless' from source: role '' defaults 25201 1726882705.20794: variable '__network_wireless_connections_defined' from source: role '' defaults 25201 1726882705.21022: variable 'network_connections' from source: task vars 25201 1726882705.21025: variable 'interface' from source: play vars 25201 1726882705.21076: variable 'interface' from source: play vars 25201 1726882705.21092: variable '__network_packages_default_team' from source: role '' defaults 25201 1726882705.21144: variable '__network_team_connections_defined' from source: role '' defaults 25201 1726882705.21336: variable 'network_connections' from source: task vars 25201 1726882705.21339: variable 'interface' from source: play vars 25201 1726882705.21389: variable 'interface' from source: play vars 25201 1726882705.21424: variable '__network_service_name_default_initscripts' from source: role '' defaults 25201 1726882705.21468: variable '__network_service_name_default_initscripts' from source: role '' defaults 25201 1726882705.21472: variable '__network_packages_default_initscripts' from source: role '' defaults 25201 1726882705.21517: variable '__network_packages_default_initscripts' from source: role '' defaults 25201 1726882705.21649: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 25201 1726882705.21946: variable 'network_connections' from source: task vars 25201 1726882705.21950: variable 'interface' from source: play vars 25201 1726882705.21995: variable 'interface' from source: play vars 25201 1726882705.22001: variable 'ansible_distribution' from source: facts 25201 1726882705.22004: variable '__network_rh_distros' from source: role '' defaults 25201 1726882705.22010: variable 'ansible_distribution_major_version' from source: facts 25201 1726882705.22026: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 25201 1726882705.22129: variable 'ansible_distribution' from source: facts 25201 
1726882705.22132: variable '__network_rh_distros' from source: role '' defaults 25201 1726882705.22134: variable 'ansible_distribution_major_version' from source: facts 25201 1726882705.22148: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 25201 1726882705.22252: variable 'ansible_distribution' from source: facts 25201 1726882705.22257: variable '__network_rh_distros' from source: role '' defaults 25201 1726882705.22260: variable 'ansible_distribution_major_version' from source: facts 25201 1726882705.22286: variable 'network_provider' from source: set_fact 25201 1726882705.22296: variable 'ansible_facts' from source: unknown 25201 1726882705.22653: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 25201 1726882705.22659: when evaluation is False, skipping this task 25201 1726882705.22661: _execute() done 25201 1726882705.22686: dumping result to json 25201 1726882705.22689: done dumping result, returning 25201 1726882705.22692: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages [0e448fcc-3ce9-313b-197e-000000000076] 25201 1726882705.22694: sending task result for task 0e448fcc-3ce9-313b-197e-000000000076 25201 1726882705.22787: done sending task result for task 0e448fcc-3ce9-313b-197e-000000000076 25201 1726882705.22790: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 25201 1726882705.22852: no more pending results, returning what we have 25201 1726882705.22856: results queue empty 25201 1726882705.22857: checking for any_errors_fatal 25201 1726882705.22867: done checking for any_errors_fatal 25201 1726882705.22868: checking for max_fail_percentage 25201 1726882705.22871: done checking for max_fail_percentage 25201 1726882705.22871: checking to see if all hosts have failed and the running result is not ok 25201 1726882705.22872: done checking to see if all hosts have failed 25201 1726882705.22873: getting the remaining hosts for this loop 25201 1726882705.22874: done getting the remaining hosts for this loop 25201 1726882705.22878: getting the next task for host managed_node2 25201 1726882705.22884: done getting next task for host managed_node2 25201 1726882705.22887: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 25201 1726882705.22890: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 25201 1726882705.22906: getting variables 25201 1726882705.22907: in VariableManager get_vars() 25201 1726882705.22941: Calling all_inventory to load vars for managed_node2 25201 1726882705.22943: Calling groups_inventory to load vars for managed_node2 25201 1726882705.22945: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882705.22954: Calling all_plugins_play to load vars for managed_node2 25201 1726882705.22956: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882705.22959: Calling groups_plugins_play to load vars for managed_node2 25201 1726882705.24539: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882705.25620: done with get_vars() 25201 1726882705.25635: done getting variables 25201 1726882705.25679: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 21:38:25 -0400 (0:00:00.117) 0:00:26.431 ****** 25201 1726882705.25705: entering _queue_task() for managed_node2/package 25201 1726882705.25912: worker is 1 (out of 1 available) 25201 1726882705.25926: exiting _queue_task() for managed_node2/package 25201 1726882705.25938: done queuing things up, now waiting for results queue to drain 25201 1726882705.25940: waiting for pending results... 
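The "Install packages" task above was skipped because its guard evaluated to False: every entry in network_packages was already present in the gathered package facts. A minimal sketch of a package task carrying the same guard seen in the log (illustrative only, not the actual task from roles/network/tasks/main.yml:73):

    - name: Install packages  # sketch; the real role task is not reproduced here
      ansible.builtin.package:
        name: "{{ network_packages }}"
        state: present
      when:
        - ansible_distribution_major_version != '6'
        - not network_packages is subset(ansible_facts.packages.keys())

With a guard like this, the package module only runs when at least one requested package is missing from ansible_facts.packages, which keeps repeated role runs cheap.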
25201 1726882705.26099: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 25201 1726882705.26191: in run() - task 0e448fcc-3ce9-313b-197e-000000000077 25201 1726882705.26200: variable 'ansible_search_path' from source: unknown 25201 1726882705.26204: variable 'ansible_search_path' from source: unknown 25201 1726882705.26231: calling self._execute() 25201 1726882705.26308: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882705.26312: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882705.26321: variable 'omit' from source: magic vars 25201 1726882705.26626: variable 'ansible_distribution_major_version' from source: facts 25201 1726882705.26636: Evaluated conditional (ansible_distribution_major_version != '6'): True 25201 1726882705.26742: variable 'network_state' from source: role '' defaults 25201 1726882705.26750: Evaluated conditional (network_state != {}): False 25201 1726882705.26753: when evaluation is False, skipping this task 25201 1726882705.26755: _execute() done 25201 1726882705.26759: dumping result to json 25201 1726882705.26761: done dumping result, returning 25201 1726882705.26775: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0e448fcc-3ce9-313b-197e-000000000077] 25201 1726882705.26812: sending task result for task 0e448fcc-3ce9-313b-197e-000000000077 25201 1726882705.26888: done sending task result for task 0e448fcc-3ce9-313b-197e-000000000077 25201 1726882705.26891: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 25201 1726882705.26961: no more pending results, returning what we have 25201 1726882705.26967: results queue empty 25201 1726882705.26969: checking for any_errors_fatal 25201 1726882705.26973: done checking for any_errors_fatal 25201 1726882705.26974: checking for max_fail_percentage 25201 1726882705.26976: done checking for max_fail_percentage 25201 1726882705.26977: checking to see if all hosts have failed and the running result is not ok 25201 1726882705.26977: done checking to see if all hosts have failed 25201 1726882705.26978: getting the remaining hosts for this loop 25201 1726882705.26980: done getting the remaining hosts for this loop 25201 1726882705.26985: getting the next task for host managed_node2 25201 1726882705.27039: done getting next task for host managed_node2 25201 1726882705.27043: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 25201 1726882705.27046: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 25201 1726882705.27061: getting variables 25201 1726882705.27063: in VariableManager get_vars() 25201 1726882705.27099: Calling all_inventory to load vars for managed_node2 25201 1726882705.27101: Calling groups_inventory to load vars for managed_node2 25201 1726882705.27104: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882705.27112: Calling all_plugins_play to load vars for managed_node2 25201 1726882705.27115: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882705.27117: Calling groups_plugins_play to load vars for managed_node2 25201 1726882705.28506: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882705.30256: done with get_vars() 25201 1726882705.30281: done getting variables 25201 1726882705.30337: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 21:38:25 -0400 (0:00:00.046) 0:00:26.478 ****** 25201 1726882705.30372: entering _queue_task() for managed_node2/package 25201 1726882705.30617: worker is 1 (out of 1 available) 25201 1726882705.30630: exiting _queue_task() for managed_node2/package 25201 1726882705.30641: done queuing things up, now waiting for results queue to drain 25201 1726882705.30643: waiting for pending results... 
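The nmstate-related install above was skipped because network_state is empty in this run; the role is only managing network_connections here. A minimal sketch of a task gated the same way (package names assumed for illustration):

    - name: Install NetworkManager and nmstate when using network_state variable  # sketch
      ansible.builtin.package:
        name:
          - NetworkManager   # package list assumed for illustration
          - nmstate
        state: present
      when: network_state != {}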
25201 1726882705.30920: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 25201 1726882705.31044: in run() - task 0e448fcc-3ce9-313b-197e-000000000078 25201 1726882705.31058: variable 'ansible_search_path' from source: unknown 25201 1726882705.31061: variable 'ansible_search_path' from source: unknown 25201 1726882705.31102: calling self._execute() 25201 1726882705.31188: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882705.31192: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882705.31242: variable 'omit' from source: magic vars 25201 1726882705.31675: variable 'ansible_distribution_major_version' from source: facts 25201 1726882705.31678: Evaluated conditional (ansible_distribution_major_version != '6'): True 25201 1726882705.31859: variable 'network_state' from source: role '' defaults 25201 1726882705.31879: Evaluated conditional (network_state != {}): False 25201 1726882705.31882: when evaluation is False, skipping this task 25201 1726882705.31885: _execute() done 25201 1726882705.31887: dumping result to json 25201 1726882705.31891: done dumping result, returning 25201 1726882705.31897: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0e448fcc-3ce9-313b-197e-000000000078] 25201 1726882705.31904: sending task result for task 0e448fcc-3ce9-313b-197e-000000000078 skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 25201 1726882705.32050: no more pending results, returning what we have 25201 1726882705.32054: results queue empty 25201 1726882705.32055: checking for any_errors_fatal 25201 1726882705.32062: done checking for any_errors_fatal 25201 1726882705.32065: checking for max_fail_percentage 25201 1726882705.32068: done checking for max_fail_percentage 25201 1726882705.32069: checking to see if all hosts have failed and the running result is not ok 25201 1726882705.32070: done checking to see if all hosts have failed 25201 1726882705.32071: getting the remaining hosts for this loop 25201 1726882705.32073: done getting the remaining hosts for this loop 25201 1726882705.32077: getting the next task for host managed_node2 25201 1726882705.32085: done getting next task for host managed_node2 25201 1726882705.32089: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 25201 1726882705.32092: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 25201 1726882705.32110: done sending task result for task 0e448fcc-3ce9-313b-197e-000000000078 25201 1726882705.32114: WORKER PROCESS EXITING 25201 1726882705.32125: getting variables 25201 1726882705.32128: in VariableManager get_vars() 25201 1726882705.32169: Calling all_inventory to load vars for managed_node2 25201 1726882705.32172: Calling groups_inventory to load vars for managed_node2 25201 1726882705.32175: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882705.32187: Calling all_plugins_play to load vars for managed_node2 25201 1726882705.32191: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882705.32194: Calling groups_plugins_play to load vars for managed_node2 25201 1726882705.33903: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882705.35033: done with get_vars() 25201 1726882705.35048: done getting variables 25201 1726882705.35093: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 21:38:25 -0400 (0:00:00.047) 0:00:26.525 ****** 25201 1726882705.35117: entering _queue_task() for managed_node2/service 25201 1726882705.35293: worker is 1 (out of 1 available) 25201 1726882705.35305: exiting _queue_task() for managed_node2/service 25201 1726882705.35315: done queuing things up, now waiting for results queue to drain 25201 1726882705.35317: waiting for pending results... 
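The python3-libnmstate install is skipped for the same reason: network_state != {} evaluates to False. For comparison, a play that set a non-empty network_state would make both guarded installs above run; a minimal sketch, with the interface layout assumed purely for illustration:

    - hosts: managed_node2
      vars:
        network_state:            # any non-empty mapping satisfies the guard
          interfaces:
            - name: eth0          # interface name assumed for illustration
              type: ethernet
              state: up
      roles:
        - fedora.linux_system_roles.network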
25201 1726882705.35485: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 25201 1726882705.35572: in run() - task 0e448fcc-3ce9-313b-197e-000000000079 25201 1726882705.35583: variable 'ansible_search_path' from source: unknown 25201 1726882705.35586: variable 'ansible_search_path' from source: unknown 25201 1726882705.35618: calling self._execute() 25201 1726882705.35683: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882705.35686: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882705.35694: variable 'omit' from source: magic vars 25201 1726882705.35945: variable 'ansible_distribution_major_version' from source: facts 25201 1726882705.35954: Evaluated conditional (ansible_distribution_major_version != '6'): True 25201 1726882705.36034: variable '__network_wireless_connections_defined' from source: role '' defaults 25201 1726882705.36167: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 25201 1726882705.37697: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 25201 1726882705.37741: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 25201 1726882705.37770: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 25201 1726882705.37797: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 25201 1726882705.37817: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 25201 1726882705.37872: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25201 1726882705.37902: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25201 1726882705.37921: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25201 1726882705.37949: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25201 1726882705.37959: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25201 1726882705.37992: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25201 1726882705.38008: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25201 1726882705.38026: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 25201 1726882705.38052: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25201 1726882705.38062: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25201 1726882705.38095: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25201 1726882705.38112: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25201 1726882705.38130: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25201 1726882705.38155: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25201 1726882705.38169: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25201 1726882705.38276: variable 'network_connections' from source: task vars 25201 1726882705.38286: variable 'interface' from source: play vars 25201 1726882705.38331: variable 'interface' from source: play vars 25201 1726882705.38383: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 25201 1726882705.38488: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 25201 1726882705.38513: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 25201 1726882705.38535: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 25201 1726882705.38556: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 25201 1726882705.38590: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 25201 1726882705.38605: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 25201 1726882705.38621: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 25201 1726882705.38638: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 25201 1726882705.38678: variable '__network_team_connections_defined' from source: role '' defaults 25201 1726882705.38827: variable 'network_connections' from source: task vars 25201 1726882705.38830: variable 'interface' from source: 
play vars 25201 1726882705.38875: variable 'interface' from source: play vars 25201 1726882705.38895: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 25201 1726882705.38900: when evaluation is False, skipping this task 25201 1726882705.38902: _execute() done 25201 1726882705.38905: dumping result to json 25201 1726882705.38907: done dumping result, returning 25201 1726882705.38909: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0e448fcc-3ce9-313b-197e-000000000079] 25201 1726882705.38920: sending task result for task 0e448fcc-3ce9-313b-197e-000000000079 25201 1726882705.39009: done sending task result for task 0e448fcc-3ce9-313b-197e-000000000079 25201 1726882705.39018: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 25201 1726882705.39058: no more pending results, returning what we have 25201 1726882705.39061: results queue empty 25201 1726882705.39062: checking for any_errors_fatal 25201 1726882705.39070: done checking for any_errors_fatal 25201 1726882705.39071: checking for max_fail_percentage 25201 1726882705.39073: done checking for max_fail_percentage 25201 1726882705.39074: checking to see if all hosts have failed and the running result is not ok 25201 1726882705.39075: done checking to see if all hosts have failed 25201 1726882705.39075: getting the remaining hosts for this loop 25201 1726882705.39077: done getting the remaining hosts for this loop 25201 1726882705.39080: getting the next task for host managed_node2 25201 1726882705.39085: done getting next task for host managed_node2 25201 1726882705.39088: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 25201 1726882705.39091: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 25201 1726882705.39112: getting variables 25201 1726882705.39114: in VariableManager get_vars() 25201 1726882705.39146: Calling all_inventory to load vars for managed_node2 25201 1726882705.39148: Calling groups_inventory to load vars for managed_node2 25201 1726882705.39150: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882705.39157: Calling all_plugins_play to load vars for managed_node2 25201 1726882705.39159: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882705.39161: Calling groups_plugins_play to load vars for managed_node2 25201 1726882705.40039: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882705.40959: done with get_vars() 25201 1726882705.40977: done getting variables 25201 1726882705.41014: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 21:38:25 -0400 (0:00:00.059) 0:00:26.584 ****** 25201 1726882705.41034: entering _queue_task() for managed_node2/service 25201 1726882705.41214: worker is 1 (out of 1 available) 25201 1726882705.41227: exiting _queue_task() for managed_node2/service 25201 1726882705.41238: done queuing things up, now waiting for results queue to drain 25201 1726882705.41240: waiting for pending results... 25201 1726882705.41406: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 25201 1726882705.41495: in run() - task 0e448fcc-3ce9-313b-197e-00000000007a 25201 1726882705.41505: variable 'ansible_search_path' from source: unknown 25201 1726882705.41508: variable 'ansible_search_path' from source: unknown 25201 1726882705.41535: calling self._execute() 25201 1726882705.41607: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882705.41611: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882705.41619: variable 'omit' from source: magic vars 25201 1726882705.41887: variable 'ansible_distribution_major_version' from source: facts 25201 1726882705.41896: Evaluated conditional (ansible_distribution_major_version != '6'): True 25201 1726882705.42004: variable 'network_provider' from source: set_fact 25201 1726882705.42008: variable 'network_state' from source: role '' defaults 25201 1726882705.42018: Evaluated conditional (network_provider == "nm" or network_state != {}): True 25201 1726882705.42025: variable 'omit' from source: magic vars 25201 1726882705.42065: variable 'omit' from source: magic vars 25201 1726882705.42086: variable 'network_service_name' from source: role '' defaults 25201 1726882705.42133: variable 'network_service_name' from source: role '' defaults 25201 1726882705.42207: variable '__network_provider_setup' from source: role '' defaults 25201 1726882705.42211: variable '__network_service_name_default_nm' from source: role '' defaults 25201 1726882705.42258: variable '__network_service_name_default_nm' from source: role '' defaults 25201 1726882705.42266: variable '__network_packages_default_nm' from source: role '' defaults 
25201 1726882705.42311: variable '__network_packages_default_nm' from source: role '' defaults 25201 1726882705.42455: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 25201 1726882705.43952: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 25201 1726882705.44000: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 25201 1726882705.44026: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 25201 1726882705.44058: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 25201 1726882705.44082: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 25201 1726882705.44136: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25201 1726882705.44155: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25201 1726882705.44177: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25201 1726882705.44206: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25201 1726882705.44217: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25201 1726882705.44246: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25201 1726882705.44262: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25201 1726882705.44283: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25201 1726882705.44313: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25201 1726882705.44322: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25201 1726882705.44465: variable '__network_packages_default_gobject_packages' from source: role '' defaults 25201 1726882705.44538: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25201 1726882705.44555: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25201 1726882705.44575: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25201 1726882705.44600: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25201 1726882705.44611: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25201 1726882705.44677: variable 'ansible_python' from source: facts 25201 1726882705.44693: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 25201 1726882705.44751: variable '__network_wpa_supplicant_required' from source: role '' defaults 25201 1726882705.44807: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 25201 1726882705.44891: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25201 1726882705.44908: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25201 1726882705.44924: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25201 1726882705.44953: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25201 1726882705.44961: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25201 1726882705.44999: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25201 1726882705.45018: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25201 1726882705.45035: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25201 1726882705.45069: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25201 1726882705.45081: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25201 1726882705.45172: variable 'network_connections' from 
source: task vars 25201 1726882705.45181: variable 'interface' from source: play vars 25201 1726882705.45230: variable 'interface' from source: play vars 25201 1726882705.45303: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 25201 1726882705.45415: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 25201 1726882705.45448: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 25201 1726882705.45480: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 25201 1726882705.45521: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 25201 1726882705.45561: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 25201 1726882705.45585: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 25201 1726882705.45611: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 25201 1726882705.45632: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 25201 1726882705.45669: variable '__network_wireless_connections_defined' from source: role '' defaults 25201 1726882705.45841: variable 'network_connections' from source: task vars 25201 1726882705.45847: variable 'interface' from source: play vars 25201 1726882705.45900: variable 'interface' from source: play vars 25201 1726882705.45925: variable '__network_packages_default_wireless' from source: role '' defaults 25201 1726882705.45981: variable '__network_wireless_connections_defined' from source: role '' defaults 25201 1726882705.46160: variable 'network_connections' from source: task vars 25201 1726882705.46166: variable 'interface' from source: play vars 25201 1726882705.46214: variable 'interface' from source: play vars 25201 1726882705.46229: variable '__network_packages_default_team' from source: role '' defaults 25201 1726882705.46287: variable '__network_team_connections_defined' from source: role '' defaults 25201 1726882705.46467: variable 'network_connections' from source: task vars 25201 1726882705.46477: variable 'interface' from source: play vars 25201 1726882705.46520: variable 'interface' from source: play vars 25201 1726882705.46555: variable '__network_service_name_default_initscripts' from source: role '' defaults 25201 1726882705.46601: variable '__network_service_name_default_initscripts' from source: role '' defaults 25201 1726882705.46607: variable '__network_packages_default_initscripts' from source: role '' defaults 25201 1726882705.46648: variable '__network_packages_default_initscripts' from source: role '' defaults 25201 1726882705.46788: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 25201 1726882705.47090: variable 'network_connections' from source: task vars 25201 1726882705.47094: variable 'interface' from source: play vars 25201 1726882705.47137: variable 'interface' from source: play vars 25201 
1726882705.47144: variable 'ansible_distribution' from source: facts 25201 1726882705.47147: variable '__network_rh_distros' from source: role '' defaults 25201 1726882705.47152: variable 'ansible_distribution_major_version' from source: facts 25201 1726882705.47163: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 25201 1726882705.47276: variable 'ansible_distribution' from source: facts 25201 1726882705.47280: variable '__network_rh_distros' from source: role '' defaults 25201 1726882705.47284: variable 'ansible_distribution_major_version' from source: facts 25201 1726882705.47294: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 25201 1726882705.47405: variable 'ansible_distribution' from source: facts 25201 1726882705.47408: variable '__network_rh_distros' from source: role '' defaults 25201 1726882705.47413: variable 'ansible_distribution_major_version' from source: facts 25201 1726882705.47437: variable 'network_provider' from source: set_fact 25201 1726882705.47456: variable 'omit' from source: magic vars 25201 1726882705.47474: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25201 1726882705.47494: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25201 1726882705.47507: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25201 1726882705.47520: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25201 1726882705.47528: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25201 1726882705.47549: variable 'inventory_hostname' from source: host vars for 'managed_node2' 25201 1726882705.47552: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882705.47554: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882705.47622: Set connection var ansible_shell_executable to /bin/sh 25201 1726882705.47625: Set connection var ansible_pipelining to False 25201 1726882705.47631: Set connection var ansible_connection to ssh 25201 1726882705.47636: Set connection var ansible_module_compression to ZIP_DEFLATED 25201 1726882705.47638: Set connection var ansible_shell_type to sh 25201 1726882705.47644: Set connection var ansible_timeout to 10 25201 1726882705.47660: variable 'ansible_shell_executable' from source: unknown 25201 1726882705.47671: variable 'ansible_connection' from source: unknown 25201 1726882705.47675: variable 'ansible_module_compression' from source: unknown 25201 1726882705.47679: variable 'ansible_shell_type' from source: unknown 25201 1726882705.47681: variable 'ansible_shell_executable' from source: unknown 25201 1726882705.47683: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882705.47685: variable 'ansible_pipelining' from source: unknown 25201 1726882705.47687: variable 'ansible_timeout' from source: unknown 25201 1726882705.47689: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882705.47748: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 
(found_in_cache=True, class_only=False) 25201 1726882705.47756: variable 'omit' from source: magic vars 25201 1726882705.47762: starting attempt loop 25201 1726882705.47768: running the handler 25201 1726882705.47822: variable 'ansible_facts' from source: unknown 25201 1726882705.48288: _low_level_execute_command(): starting 25201 1726882705.48293: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 25201 1726882705.48793: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882705.48801: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882705.48831: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882705.48843: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<< 25201 1726882705.48853: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882705.48902: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882705.48913: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882705.49032: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882705.50709: stdout chunk (state=3): >>>/root <<< 25201 1726882705.50804: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882705.50852: stderr chunk (state=3): >>><<< 25201 1726882705.50855: stdout chunk (state=3): >>><<< 25201 1726882705.50874: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25201 1726882705.50883: _low_level_execute_command(): starting 25201 1726882705.50889: 
_low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882705.5087364-26341-100328425297120 `" && echo ansible-tmp-1726882705.5087364-26341-100328425297120="` echo /root/.ansible/tmp/ansible-tmp-1726882705.5087364-26341-100328425297120 `" ) && sleep 0' 25201 1726882705.51324: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882705.51329: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882705.51370: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 25201 1726882705.51383: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882705.51426: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882705.51438: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882705.51545: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882705.53400: stdout chunk (state=3): >>>ansible-tmp-1726882705.5087364-26341-100328425297120=/root/.ansible/tmp/ansible-tmp-1726882705.5087364-26341-100328425297120 <<< 25201 1726882705.53509: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882705.53550: stderr chunk (state=3): >>><<< 25201 1726882705.53553: stdout chunk (state=3): >>><<< 25201 1726882705.53568: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882705.5087364-26341-100328425297120=/root/.ansible/tmp/ansible-tmp-1726882705.5087364-26341-100328425297120 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status 
from master 0 25201 1726882705.53590: variable 'ansible_module_compression' from source: unknown 25201 1726882705.53631: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-25201fmfeipqk/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 25201 1726882705.53680: variable 'ansible_facts' from source: unknown 25201 1726882705.53818: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882705.5087364-26341-100328425297120/AnsiballZ_systemd.py 25201 1726882705.53922: Sending initial data 25201 1726882705.53925: Sent initial data (156 bytes) 25201 1726882705.54574: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882705.54588: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882705.54612: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882705.54628: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882705.54681: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882705.54693: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882705.54800: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882705.56513: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 25201 1726882705.56604: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 25201 1726882705.56703: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-25201fmfeipqk/tmpsce5qmuv /root/.ansible/tmp/ansible-tmp-1726882705.5087364-26341-100328425297120/AnsiballZ_systemd.py <<< 25201 1726882705.56806: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 25201 1726882705.59091: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882705.59157: stderr chunk (state=3): >>><<< 25201 1726882705.59160: stdout chunk (state=3): >>><<< 25201 1726882705.59180: done transferring module to remote 25201 1726882705.59194: _low_level_execute_command(): starting 25201 1726882705.59197: 
_low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882705.5087364-26341-100328425297120/ /root/.ansible/tmp/ansible-tmp-1726882705.5087364-26341-100328425297120/AnsiballZ_systemd.py && sleep 0' 25201 1726882705.59678: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882705.59768: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882705.59783: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882705.60234: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882705.62004: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882705.62047: stderr chunk (state=3): >>><<< 25201 1726882705.62050: stdout chunk (state=3): >>><<< 25201 1726882705.62061: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25201 1726882705.62066: _low_level_execute_command(): starting 25201 1726882705.62073: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882705.5087364-26341-100328425297120/AnsiballZ_systemd.py && sleep 0' 25201 1726882705.62501: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25201 1726882705.62511: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882705.62519: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 25201 1726882705.62529: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882705.62558: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882705.62567: stderr chunk (state=3): >>>debug2: match not found <<< 25201 1726882705.62577: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882705.62588: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25201 1726882705.62595: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 25201 1726882705.62600: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882705.62608: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882705.62616: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882705.62623: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<< 25201 1726882705.62625: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882705.62680: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882705.62701: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882705.62708: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882705.62824: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882705.87794: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6692", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ExecMainStartTimestampMonotonic": "202392137", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "6692", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", 
"ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManag<<< 25201 1726882705.87831: stdout chunk (state=3): >>>er.service", "ControlGroupId": "3602", "MemoryCurrent": "9170944", "MemoryAvailable": "infinity", "CPUUsageNSec": "1787822000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", 
"ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "Watchdo<<< 25201 1726882705.87839: stdout chunk (state=3): >>>gSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service network.service multi-user.target network.target shutdown.target cloud-init.service", "After": "cloud-init-local.service dbus-broker.service network-pre.target system.slice dbus.socket systemd-journald.socket basic.target sysinit.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:32:57 EDT", "StateChangeTimestampMonotonic": "316658837", "InactiveExitTimestamp": "Fri 2024-09-20 21:31:03 EDT", "InactiveExitTimestampMonotonic": "202392395", "ActiveEnterTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ActiveEnterTimestampMonotonic": "202472383", "ActiveExitTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ActiveExitTimestampMonotonic": "202362940", "InactiveEnterTimestamp": "Fri 2024-09-20 21:31:03 EDT", "InactiveEnterTimestampMonotonic": "202381901", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ConditionTimestampMonotonic": "202382734", "AssertTimestamp": "Fri 2024-09-20 21:31:03 EDT", "AssertTimestampMonotonic": "202382737", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "55e27919215348fab37a11b7ea324f90", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, 
"daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 25201 1726882705.89349: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. <<< 25201 1726882705.89418: stderr chunk (state=3): >>><<< 25201 1726882705.89421: stdout chunk (state=3): >>><<< 25201 1726882705.89436: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6692", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ExecMainStartTimestampMonotonic": "202392137", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "6692", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3602", "MemoryCurrent": "9170944", "MemoryAvailable": "infinity", "CPUUsageNSec": "1787822000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": 
"auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service network.service multi-user.target network.target shutdown.target cloud-init.service", "After": "cloud-init-local.service dbus-broker.service network-pre.target system.slice 
dbus.socket systemd-journald.socket basic.target sysinit.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:32:57 EDT", "StateChangeTimestampMonotonic": "316658837", "InactiveExitTimestamp": "Fri 2024-09-20 21:31:03 EDT", "InactiveExitTimestampMonotonic": "202392395", "ActiveEnterTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ActiveEnterTimestampMonotonic": "202472383", "ActiveExitTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ActiveExitTimestampMonotonic": "202362940", "InactiveEnterTimestamp": "Fri 2024-09-20 21:31:03 EDT", "InactiveEnterTimestampMonotonic": "202381901", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ConditionTimestampMonotonic": "202382734", "AssertTimestamp": "Fri 2024-09-20 21:31:03 EDT", "AssertTimestampMonotonic": "202382737", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "55e27919215348fab37a11b7ea324f90", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
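The JSON dump above is the full return of the systemd module for NetworkManager on managed_node2 (ActiveState "active", UnitFileState "enabled", hence "changed": false). A minimal standalone sketch of an equivalent call follows; the arguments mirror the "invocation"/"module_args" block echoed in that JSON, but this is not the role's actual task file, which reaches the module through the 'service' action plugin loaded earlier in this trace.

# Minimal sketch; args mirror the module_args echoed in the log.
# The role itself uses the 'service' action plugin, which dispatched to
# ansible.legacy.systemd here; calling systemd directly is a simplification.
- name: Enable and start NetworkManager
  ansible.builtin.systemd:
    name: NetworkManager
    state: started
    enabled: true
    scope: system
  no_log: true    # matches '_ansible_no_log': True, hence the "censored" result body below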
25201 1726882705.89620: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882705.5087364-26341-100328425297120/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 25201 1726882705.89671: _low_level_execute_command(): starting 25201 1726882705.89681: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882705.5087364-26341-100328425297120/ > /dev/null 2>&1 && sleep 0' 25201 1726882705.90719: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882705.90723: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882705.90754: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882705.90758: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882705.90760: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882705.90839: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882705.90862: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882705.90992: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882705.92846: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882705.92849: stdout chunk (state=3): >>><<< 25201 1726882705.92851: stderr chunk (state=3): >>><<< 25201 1726882705.93072: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25201 1726882705.93076: handler run complete 25201 1726882705.93078: attempt loop complete, returning result 25201 1726882705.93081: _execute() done 25201 1726882705.93083: dumping result to json 25201 1726882705.93085: done dumping result, returning 25201 1726882705.93087: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0e448fcc-3ce9-313b-197e-00000000007a] 25201 1726882705.93089: sending task result for task 0e448fcc-3ce9-313b-197e-00000000007a ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 25201 1726882705.93315: no more pending results, returning what we have 25201 1726882705.93318: results queue empty 25201 1726882705.93319: checking for any_errors_fatal 25201 1726882705.93325: done checking for any_errors_fatal 25201 1726882705.93326: checking for max_fail_percentage 25201 1726882705.93328: done checking for max_fail_percentage 25201 1726882705.93328: checking to see if all hosts have failed and the running result is not ok 25201 1726882705.93329: done checking to see if all hosts have failed 25201 1726882705.93330: getting the remaining hosts for this loop 25201 1726882705.93332: done getting the remaining hosts for this loop 25201 1726882705.93336: getting the next task for host managed_node2 25201 1726882705.93342: done getting next task for host managed_node2 25201 1726882705.93346: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 25201 1726882705.93349: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 25201 1726882705.93359: getting variables 25201 1726882705.93361: in VariableManager get_vars() 25201 1726882705.93429: Calling all_inventory to load vars for managed_node2 25201 1726882705.93433: Calling groups_inventory to load vars for managed_node2 25201 1726882705.93435: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882705.93446: Calling all_plugins_play to load vars for managed_node2 25201 1726882705.93449: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882705.93452: Calling groups_plugins_play to load vars for managed_node2 25201 1726882705.94433: done sending task result for task 0e448fcc-3ce9-313b-197e-00000000007a 25201 1726882705.94436: WORKER PROCESS EXITING 25201 1726882705.95478: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882705.96494: done with get_vars() 25201 1726882705.96511: done getting variables 25201 1726882705.96553: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 21:38:25 -0400 (0:00:00.555) 0:00:27.140 ****** 25201 1726882705.96600: entering _queue_task() for managed_node2/service 25201 1726882705.96878: worker is 1 (out of 1 available) 25201 1726882705.96891: exiting _queue_task() for managed_node2/service 25201 1726882705.96905: done queuing things up, now waiting for results queue to drain 25201 1726882705.96908: waiting for pending results... 
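The entries that follow evaluate this task's guards one by one (ansible_distribution_major_version != '6' is True, network_provider == "nm" is True, __network_wpa_supplicant_required is False) and then skip it. A minimal sketch of a task guarded the same way is shown below; the variable names are taken from those logged evaluations, while the service body itself is only an illustrative assumption, not the role's actual task.

# Illustrative sketch: the conditions mirror the evaluations logged below;
# the service body is an assumption, not the role's task file.
- name: Enable and start wpa_supplicant
  ansible.builtin.service:
    name: wpa_supplicant
    state: started
    enabled: true
  when:
    - ansible_distribution_major_version != '6'
    - network_provider == "nm"
    - __network_wpa_supplicant_required | bool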
25201 1726882705.97210: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 25201 1726882705.97356: in run() - task 0e448fcc-3ce9-313b-197e-00000000007b 25201 1726882705.97384: variable 'ansible_search_path' from source: unknown 25201 1726882705.97393: variable 'ansible_search_path' from source: unknown 25201 1726882705.97429: calling self._execute() 25201 1726882705.97532: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882705.97548: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882705.97566: variable 'omit' from source: magic vars 25201 1726882705.97951: variable 'ansible_distribution_major_version' from source: facts 25201 1726882705.97976: Evaluated conditional (ansible_distribution_major_version != '6'): True 25201 1726882705.98112: variable 'network_provider' from source: set_fact 25201 1726882705.98128: Evaluated conditional (network_provider == "nm"): True 25201 1726882705.98223: variable '__network_wpa_supplicant_required' from source: role '' defaults 25201 1726882705.98323: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 25201 1726882705.98491: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 25201 1726882706.01092: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 25201 1726882706.01154: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 25201 1726882706.01199: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 25201 1726882706.01238: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 25201 1726882706.01271: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 25201 1726882706.01349: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25201 1726882706.01388: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25201 1726882706.01419: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25201 1726882706.01461: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25201 1726882706.01486: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25201 1726882706.01536: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25201 1726882706.01562: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 25201 1726882706.01595: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25201 1726882706.01639: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25201 1726882706.01658: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25201 1726882706.01705: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25201 1726882706.01737: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25201 1726882706.01771: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25201 1726882706.01815: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25201 1726882706.01840: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25201 1726882706.01997: variable 'network_connections' from source: task vars 25201 1726882706.02016: variable 'interface' from source: play vars 25201 1726882706.02096: variable 'interface' from source: play vars 25201 1726882706.02176: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 25201 1726882706.02353: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 25201 1726882706.02399: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 25201 1726882706.02437: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 25201 1726882706.02476: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 25201 1726882706.02524: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 25201 1726882706.02551: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 25201 1726882706.02586: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 25201 1726882706.02619: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 25201 
1726882706.02687: variable '__network_wireless_connections_defined' from source: role '' defaults 25201 1726882706.02931: variable 'network_connections' from source: task vars 25201 1726882706.02942: variable 'interface' from source: play vars 25201 1726882706.03009: variable 'interface' from source: play vars 25201 1726882706.03046: Evaluated conditional (__network_wpa_supplicant_required): False 25201 1726882706.03053: when evaluation is False, skipping this task 25201 1726882706.03060: _execute() done 25201 1726882706.03070: dumping result to json 25201 1726882706.03079: done dumping result, returning 25201 1726882706.03089: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0e448fcc-3ce9-313b-197e-00000000007b] 25201 1726882706.03106: sending task result for task 0e448fcc-3ce9-313b-197e-00000000007b 25201 1726882706.03218: done sending task result for task 0e448fcc-3ce9-313b-197e-00000000007b 25201 1726882706.03225: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 25201 1726882706.03294: no more pending results, returning what we have 25201 1726882706.03298: results queue empty 25201 1726882706.03299: checking for any_errors_fatal 25201 1726882706.03317: done checking for any_errors_fatal 25201 1726882706.03318: checking for max_fail_percentage 25201 1726882706.03320: done checking for max_fail_percentage 25201 1726882706.03321: checking to see if all hosts have failed and the running result is not ok 25201 1726882706.03322: done checking to see if all hosts have failed 25201 1726882706.03322: getting the remaining hosts for this loop 25201 1726882706.03324: done getting the remaining hosts for this loop 25201 1726882706.03328: getting the next task for host managed_node2 25201 1726882706.03337: done getting next task for host managed_node2 25201 1726882706.03341: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 25201 1726882706.03345: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 25201 1726882706.03367: getting variables 25201 1726882706.03369: in VariableManager get_vars() 25201 1726882706.03410: Calling all_inventory to load vars for managed_node2 25201 1726882706.03414: Calling groups_inventory to load vars for managed_node2 25201 1726882706.03416: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882706.03428: Calling all_plugins_play to load vars for managed_node2 25201 1726882706.03431: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882706.03434: Calling groups_plugins_play to load vars for managed_node2 25201 1726882706.05283: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882706.06938: done with get_vars() 25201 1726882706.06961: done getting variables 25201 1726882706.07024: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 21:38:26 -0400 (0:00:00.104) 0:00:27.245 ****** 25201 1726882706.07059: entering _queue_task() for managed_node2/service 25201 1726882706.07353: worker is 1 (out of 1 available) 25201 1726882706.07369: exiting _queue_task() for managed_node2/service 25201 1726882706.07380: done queuing things up, now waiting for results queue to drain 25201 1726882706.07381: waiting for pending results... 25201 1726882706.07667: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service 25201 1726882706.07809: in run() - task 0e448fcc-3ce9-313b-197e-00000000007c 25201 1726882706.07830: variable 'ansible_search_path' from source: unknown 25201 1726882706.07837: variable 'ansible_search_path' from source: unknown 25201 1726882706.07880: calling self._execute() 25201 1726882706.07977: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882706.07987: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882706.07999: variable 'omit' from source: magic vars 25201 1726882706.08347: variable 'ansible_distribution_major_version' from source: facts 25201 1726882706.08369: Evaluated conditional (ansible_distribution_major_version != '6'): True 25201 1726882706.08488: variable 'network_provider' from source: set_fact 25201 1726882706.08499: Evaluated conditional (network_provider == "initscripts"): False 25201 1726882706.08507: when evaluation is False, skipping this task 25201 1726882706.08513: _execute() done 25201 1726882706.08519: dumping result to json 25201 1726882706.08530: done dumping result, returning 25201 1726882706.08541: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service [0e448fcc-3ce9-313b-197e-00000000007c] 25201 1726882706.08550: sending task result for task 0e448fcc-3ce9-313b-197e-00000000007c skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 25201 1726882706.08695: no more pending results, returning what we have 25201 1726882706.08699: results queue empty 25201 1726882706.08700: checking for 
any_errors_fatal 25201 1726882706.08708: done checking for any_errors_fatal 25201 1726882706.08709: checking for max_fail_percentage 25201 1726882706.08711: done checking for max_fail_percentage 25201 1726882706.08712: checking to see if all hosts have failed and the running result is not ok 25201 1726882706.08712: done checking to see if all hosts have failed 25201 1726882706.08713: getting the remaining hosts for this loop 25201 1726882706.08715: done getting the remaining hosts for this loop 25201 1726882706.08718: getting the next task for host managed_node2 25201 1726882706.08725: done getting next task for host managed_node2 25201 1726882706.08728: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 25201 1726882706.08732: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 25201 1726882706.08756: getting variables 25201 1726882706.08758: in VariableManager get_vars() 25201 1726882706.08798: Calling all_inventory to load vars for managed_node2 25201 1726882706.08801: Calling groups_inventory to load vars for managed_node2 25201 1726882706.08804: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882706.08816: Calling all_plugins_play to load vars for managed_node2 25201 1726882706.08819: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882706.08823: Calling groups_plugins_play to load vars for managed_node2 25201 1726882706.09343: done sending task result for task 0e448fcc-3ce9-313b-197e-00000000007c 25201 1726882706.09347: WORKER PROCESS EXITING 25201 1726882706.10095: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882706.11130: done with get_vars() 25201 1726882706.11145: done getting variables 25201 1726882706.11188: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 21:38:26 -0400 (0:00:00.041) 0:00:27.286 ****** 25201 1726882706.11213: entering _queue_task() for managed_node2/copy 25201 1726882706.11409: worker is 1 (out of 1 available) 25201 1726882706.11421: exiting _queue_task() for managed_node2/copy 25201 1726882706.11433: done queuing things up, now waiting for results queue to drain 25201 1726882706.11435: waiting for pending results... 
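The copy action loaded here backs an initscripts-only task; the next entries evaluate network_provider == "initscripts" as False for managed_node2 and record the usual "Conditional result was False" skip. A minimal sketch of how such a guarded file task can look is given below; the destination path, content, and mode are illustrative assumptions, not values taken from this log.

# Illustrative sketch: the guard mirrors the evaluation recorded below;
# dest, content, and mode are assumptions, not read from this trace.
- name: Ensure initscripts network file dependency is present
  ansible.builtin.copy:
    dest: /etc/sysconfig/network          # assumed path
    content: "# Created by the network role\n"
    mode: "0644"
  when: network_provider == "initscripts"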
25201 1726882706.11638: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 25201 1726882706.11773: in run() - task 0e448fcc-3ce9-313b-197e-00000000007d 25201 1726882706.11783: variable 'ansible_search_path' from source: unknown 25201 1726882706.11786: variable 'ansible_search_path' from source: unknown 25201 1726882706.11816: calling self._execute() 25201 1726882706.11886: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882706.11891: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882706.11900: variable 'omit' from source: magic vars 25201 1726882706.12168: variable 'ansible_distribution_major_version' from source: facts 25201 1726882706.12179: Evaluated conditional (ansible_distribution_major_version != '6'): True 25201 1726882706.12258: variable 'network_provider' from source: set_fact 25201 1726882706.12262: Evaluated conditional (network_provider == "initscripts"): False 25201 1726882706.12267: when evaluation is False, skipping this task 25201 1726882706.12274: _execute() done 25201 1726882706.12277: dumping result to json 25201 1726882706.12279: done dumping result, returning 25201 1726882706.12289: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0e448fcc-3ce9-313b-197e-00000000007d] 25201 1726882706.12295: sending task result for task 0e448fcc-3ce9-313b-197e-00000000007d 25201 1726882706.12385: done sending task result for task 0e448fcc-3ce9-313b-197e-00000000007d 25201 1726882706.12388: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 25201 1726882706.12546: no more pending results, returning what we have 25201 1726882706.12549: results queue empty 25201 1726882706.12550: checking for any_errors_fatal 25201 1726882706.12554: done checking for any_errors_fatal 25201 1726882706.12555: checking for max_fail_percentage 25201 1726882706.12556: done checking for max_fail_percentage 25201 1726882706.12557: checking to see if all hosts have failed and the running result is not ok 25201 1726882706.12558: done checking to see if all hosts have failed 25201 1726882706.12559: getting the remaining hosts for this loop 25201 1726882706.12560: done getting the remaining hosts for this loop 25201 1726882706.12563: getting the next task for host managed_node2 25201 1726882706.12572: done getting next task for host managed_node2 25201 1726882706.12575: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 25201 1726882706.12578: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 25201 1726882706.12592: getting variables 25201 1726882706.12594: in VariableManager get_vars() 25201 1726882706.12624: Calling all_inventory to load vars for managed_node2 25201 1726882706.12627: Calling groups_inventory to load vars for managed_node2 25201 1726882706.12629: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882706.12636: Calling all_plugins_play to load vars for managed_node2 25201 1726882706.12638: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882706.12641: Calling groups_plugins_play to load vars for managed_node2 25201 1726882706.16576: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882706.17483: done with get_vars() 25201 1726882706.17497: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 21:38:26 -0400 (0:00:00.063) 0:00:27.350 ****** 25201 1726882706.17550: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 25201 1726882706.17762: worker is 1 (out of 1 available) 25201 1726882706.17779: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 25201 1726882706.17791: done queuing things up, now waiting for results queue to drain 25201 1726882706.17792: waiting for pending results... 25201 1726882706.17996: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 25201 1726882706.18081: in run() - task 0e448fcc-3ce9-313b-197e-00000000007e 25201 1726882706.18090: variable 'ansible_search_path' from source: unknown 25201 1726882706.18094: variable 'ansible_search_path' from source: unknown 25201 1726882706.18124: calling self._execute() 25201 1726882706.18198: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882706.18203: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882706.18211: variable 'omit' from source: magic vars 25201 1726882706.18485: variable 'ansible_distribution_major_version' from source: facts 25201 1726882706.18495: Evaluated conditional (ansible_distribution_major_version != '6'): True 25201 1726882706.18501: variable 'omit' from source: magic vars 25201 1726882706.18537: variable 'omit' from source: magic vars 25201 1726882706.18650: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 25201 1726882706.21595: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 25201 1726882706.21645: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 25201 1726882706.21675: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 25201 1726882706.21701: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 25201 1726882706.21720: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 25201 1726882706.21807: variable 'network_provider' from source: set_fact 25201 1726882706.22301: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 25201 1726882706.22332: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25201 1726882706.22360: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25201 1726882706.22409: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25201 1726882706.22426: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25201 1726882706.22504: variable 'omit' from source: magic vars 25201 1726882706.22617: variable 'omit' from source: magic vars 25201 1726882706.22724: variable 'network_connections' from source: task vars 25201 1726882706.22739: variable 'interface' from source: play vars 25201 1726882706.22806: variable 'interface' from source: play vars 25201 1726882706.22950: variable 'omit' from source: magic vars 25201 1726882706.22967: variable '__lsr_ansible_managed' from source: task vars 25201 1726882706.23029: variable '__lsr_ansible_managed' from source: task vars 25201 1726882706.23427: Loaded config def from plugin (lookup/template) 25201 1726882706.23437: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 25201 1726882706.23470: File lookup term: get_ansible_managed.j2 25201 1726882706.23479: variable 'ansible_search_path' from source: unknown 25201 1726882706.23489: evaluation_path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 25201 1726882706.23507: search_path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 25201 1726882706.23526: variable 'ansible_search_path' from source: unknown 25201 1726882706.29689: variable 'ansible_managed' from source: unknown 25201 1726882706.29830: variable 'omit' from source: magic vars 25201 1726882706.29870: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25201 1726882706.29905: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25201 1726882706.29929: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25201 1726882706.29953: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25201 1726882706.29976: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25201 1726882706.30010: variable 'inventory_hostname' from source: host vars for 'managed_node2' 25201 1726882706.30019: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882706.30027: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882706.30136: Set connection var ansible_shell_executable to /bin/sh 25201 1726882706.30147: Set connection var ansible_pipelining to False 25201 1726882706.30157: Set connection var ansible_connection to ssh 25201 1726882706.30173: Set connection var ansible_module_compression to ZIP_DEFLATED 25201 1726882706.30185: Set connection var ansible_shell_type to sh 25201 1726882706.30197: Set connection var ansible_timeout to 10 25201 1726882706.30224: variable 'ansible_shell_executable' from source: unknown 25201 1726882706.30233: variable 'ansible_connection' from source: unknown 25201 1726882706.30240: variable 'ansible_module_compression' from source: unknown 25201 1726882706.30247: variable 'ansible_shell_type' from source: unknown 25201 1726882706.30254: variable 'ansible_shell_executable' from source: unknown 25201 1726882706.30260: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882706.30274: variable 'ansible_pipelining' from source: unknown 25201 1726882706.30282: variable 'ansible_timeout' from source: unknown 25201 1726882706.30294: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882706.30435: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 25201 1726882706.30462: variable 'omit' from source: magic vars 25201 1726882706.30495: starting attempt loop 25201 1726882706.30507: running the handler 25201 1726882706.30524: _low_level_execute_command(): starting 25201 1726882706.30536: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 25201 1726882706.31557: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25201 1726882706.31581: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882706.31598: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882706.31616: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882706.31658: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882706.31677: stderr chunk (state=3): >>>debug2: match not found <<< 25201 1726882706.31692: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882706.31712: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25201 1726882706.31723: stderr chunk (state=3): >>>debug2: 
resolve_canonicalize: hostname 10.31.11.158 is address <<< 25201 1726882706.31734: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 25201 1726882706.31744: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882706.31756: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882706.31776: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882706.31787: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882706.31797: stderr chunk (state=3): >>>debug2: match found <<< 25201 1726882706.31809: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882706.31893: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882706.31915: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882706.31933: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882706.32070: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882706.33722: stdout chunk (state=3): >>>/root <<< 25201 1726882706.33828: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882706.33903: stderr chunk (state=3): >>><<< 25201 1726882706.33915: stdout chunk (state=3): >>><<< 25201 1726882706.34024: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25201 1726882706.34030: _low_level_execute_command(): starting 25201 1726882706.34033: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882706.3394456-26379-47169523111753 `" && echo ansible-tmp-1726882706.3394456-26379-47169523111753="` echo /root/.ansible/tmp/ansible-tmp-1726882706.3394456-26379-47169523111753 `" ) && sleep 0' 25201 1726882706.34610: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25201 1726882706.34623: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882706.34639: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882706.34676: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882706.34723: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882706.34734: stderr chunk (state=3): >>>debug2: match not found <<< 25201 1726882706.34747: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882706.34767: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25201 1726882706.34779: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 25201 1726882706.34789: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 25201 1726882706.34800: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882706.34816: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882706.34830: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882706.34840: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882706.34850: stderr chunk (state=3): >>>debug2: match found <<< 25201 1726882706.34862: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882706.35054: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882706.35080: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882706.35098: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882706.35227: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882706.37097: stdout chunk (state=3): >>>ansible-tmp-1726882706.3394456-26379-47169523111753=/root/.ansible/tmp/ansible-tmp-1726882706.3394456-26379-47169523111753 <<< 25201 1726882706.37286: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882706.37290: stdout chunk (state=3): >>><<< 25201 1726882706.37293: stderr chunk (state=3): >>><<< 25201 1726882706.37573: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882706.3394456-26379-47169523111753=/root/.ansible/tmp/ansible-tmp-1726882706.3394456-26379-47169523111753 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25201 
1726882706.37581: variable 'ansible_module_compression' from source: unknown 25201 1726882706.37584: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-25201fmfeipqk/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 25201 1726882706.37586: variable 'ansible_facts' from source: unknown 25201 1726882706.37588: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882706.3394456-26379-47169523111753/AnsiballZ_network_connections.py 25201 1726882706.38075: Sending initial data 25201 1726882706.38079: Sent initial data (167 bytes) 25201 1726882706.39846: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25201 1726882706.40003: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882706.40018: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882706.40038: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882706.40083: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882706.40099: stderr chunk (state=3): >>>debug2: match not found <<< 25201 1726882706.40119: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882706.40439: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25201 1726882706.40452: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 25201 1726882706.40467: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 25201 1726882706.40485: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882706.40499: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882706.40515: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882706.40530: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882706.40542: stderr chunk (state=3): >>>debug2: match found <<< 25201 1726882706.40556: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882706.40640: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882706.40669: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882706.40688: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882706.40831: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882706.42627: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 <<< 25201 1726882706.42630: stderr chunk (state=3): >>>debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 25201 1726882706.42719: stderr chunk (state=3): >>>debug1: 
Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 25201 1726882706.42820: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-25201fmfeipqk/tmpezhu2c1s /root/.ansible/tmp/ansible-tmp-1726882706.3394456-26379-47169523111753/AnsiballZ_network_connections.py <<< 25201 1726882706.42917: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 25201 1726882706.44945: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882706.45070: stderr chunk (state=3): >>><<< 25201 1726882706.45074: stdout chunk (state=3): >>><<< 25201 1726882706.45076: done transferring module to remote 25201 1726882706.45079: _low_level_execute_command(): starting 25201 1726882706.45151: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882706.3394456-26379-47169523111753/ /root/.ansible/tmp/ansible-tmp-1726882706.3394456-26379-47169523111753/AnsiballZ_network_connections.py && sleep 0' 25201 1726882706.45717: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25201 1726882706.45731: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882706.45745: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882706.45768: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882706.45814: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882706.45829: stderr chunk (state=3): >>>debug2: match not found <<< 25201 1726882706.45844: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882706.45862: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25201 1726882706.45881: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 25201 1726882706.45894: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 25201 1726882706.45906: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882706.45925: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882706.45942: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882706.45954: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882706.45971: stderr chunk (state=3): >>>debug2: match found <<< 25201 1726882706.45987: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882706.46069: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882706.46088: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882706.46104: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882706.46232: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882706.48044: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882706.48104: stderr chunk (state=3): >>><<< 25201 1726882706.48107: stdout chunk (state=3): >>><<< 25201 1726882706.48191: _low_level_execute_command() done: rc=0, stdout=, 
stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25201 1726882706.48198: _low_level_execute_command(): starting 25201 1726882706.48201: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882706.3394456-26379-47169523111753/AnsiballZ_network_connections.py && sleep 0' 25201 1726882706.48740: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25201 1726882706.48756: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882706.48772: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882706.48788: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882706.48824: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882706.48834: stderr chunk (state=3): >>>debug2: match not found <<< 25201 1726882706.48845: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882706.48862: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25201 1726882706.48876: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 25201 1726882706.48885: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 25201 1726882706.48894: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882706.48905: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882706.48916: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882706.48925: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882706.48933: stderr chunk (state=3): >>>debug2: match found <<< 25201 1726882706.48942: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882706.49020: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882706.49035: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882706.49047: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882706.49194: stderr chunk (state=3): 
>>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882706.78083: stdout chunk (state=3): >>>Traceback (most recent call last): <<< 25201 1726882706.78088: stdout chunk (state=3): >>> File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_87oojnq2/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_87oojnq2/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on veth0/283a7ffe-9cfe-42e8-8b60-ec161f169c65: error=unknown <<< 25201 1726882706.78282: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "veth0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "veth0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 25201 1726882706.79870: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. <<< 25201 1726882706.79874: stdout chunk (state=3): >>><<< 25201 1726882706.79877: stderr chunk (state=3): >>><<< 25201 1726882706.80015: _low_level_execute_command() done: rc=0, stdout=Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_87oojnq2/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_87oojnq2/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on veth0/283a7ffe-9cfe-42e8-8b60-ec161f169c65: error=unknown {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "veth0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "veth0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 25201 1726882706.80018: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'veth0', 'persistent_state': 'absent', 'state': 'down'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882706.3394456-26379-47169523111753/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 25201 1726882706.80021: _low_level_execute_command(): starting 25201 1726882706.80023: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882706.3394456-26379-47169523111753/ > /dev/null 2>&1 && sleep 0' 25201 1726882706.81421: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25201 1726882706.81542: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882706.81546: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882706.81588: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 25201 1726882706.81591: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<< 25201 1726882706.81594: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882706.81777: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882706.81783: stderr 
chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882706.81786: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882706.81891: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882706.83703: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882706.83775: stderr chunk (state=3): >>><<< 25201 1726882706.83778: stdout chunk (state=3): >>><<< 25201 1726882706.84246: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25201 1726882706.84250: handler run complete 25201 1726882706.84252: attempt loop complete, returning result 25201 1726882706.84254: _execute() done 25201 1726882706.84257: dumping result to json 25201 1726882706.84259: done dumping result, returning 25201 1726882706.84261: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0e448fcc-3ce9-313b-197e-00000000007e] 25201 1726882706.84268: sending task result for task 0e448fcc-3ce9-313b-197e-00000000007e 25201 1726882706.84341: done sending task result for task 0e448fcc-3ce9-313b-197e-00000000007e 25201 1726882706.84345: WORKER PROCESS EXITING changed: [managed_node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "veth0", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 25201 1726882706.84444: no more pending results, returning what we have 25201 1726882706.84447: results queue empty 25201 1726882706.84448: checking for any_errors_fatal 25201 1726882706.84454: done checking for any_errors_fatal 25201 1726882706.84455: checking for max_fail_percentage 25201 1726882706.84456: done checking for max_fail_percentage 25201 1726882706.84457: checking to see if all hosts have failed and the running result is not ok 25201 1726882706.84458: done checking to see if all hosts have failed 25201 1726882706.84459: getting the remaining hosts for this loop 25201 1726882706.84460: done getting the remaining hosts for this loop 25201 1726882706.84465: getting the next task for host managed_node2 25201 1726882706.84471: done getting next task for host managed_node2 25201 1726882706.84475: ^ task is: TASK: 
fedora.linux_system_roles.network : Configure networking state 25201 1726882706.84478: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 25201 1726882706.84488: getting variables 25201 1726882706.84490: in VariableManager get_vars() 25201 1726882706.84526: Calling all_inventory to load vars for managed_node2 25201 1726882706.84529: Calling groups_inventory to load vars for managed_node2 25201 1726882706.84532: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882706.84540: Calling all_plugins_play to load vars for managed_node2 25201 1726882706.84543: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882706.84546: Calling groups_plugins_play to load vars for managed_node2 25201 1726882706.87373: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882706.89682: done with get_vars() 25201 1726882706.89707: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 21:38:26 -0400 (0:00:00.722) 0:00:28.072 ****** 25201 1726882706.89800: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_state 25201 1726882706.90132: worker is 1 (out of 1 available) 25201 1726882706.90145: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_state 25201 1726882706.90157: done queuing things up, now waiting for results queue to drain 25201 1726882706.90159: waiting for pending results... 
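Restated for readability, the module_args echoed in the result above amount to the following role input, reconstructed from the invocation in the log (provider nm, a single veth0 profile taken down and removed) rather than copied from the test playbook:

network_provider: nm
network_connections:
  - name: veth0              # matches the 'interface' play variable referenced earlier in the trace
    persistent_state: absent
    state: down

Note that the remote module printed an LsrNetworkNmError traceback ("Connection volatilize aborted" on veth0) to stdout ahead of its JSON payload, but the payload carries no failure flag and only a newline on stderr, so the task is recorded as changed rather than failed.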
25201 1726882706.90530: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state 25201 1726882706.90694: in run() - task 0e448fcc-3ce9-313b-197e-00000000007f 25201 1726882706.90715: variable 'ansible_search_path' from source: unknown 25201 1726882706.90728: variable 'ansible_search_path' from source: unknown 25201 1726882706.90775: calling self._execute() 25201 1726882706.90886: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882706.90898: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882706.90912: variable 'omit' from source: magic vars 25201 1726882706.91319: variable 'ansible_distribution_major_version' from source: facts 25201 1726882706.91337: Evaluated conditional (ansible_distribution_major_version != '6'): True 25201 1726882706.91469: variable 'network_state' from source: role '' defaults 25201 1726882706.91489: Evaluated conditional (network_state != {}): False 25201 1726882706.91498: when evaluation is False, skipping this task 25201 1726882706.91505: _execute() done 25201 1726882706.91512: dumping result to json 25201 1726882706.91519: done dumping result, returning 25201 1726882706.91535: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state [0e448fcc-3ce9-313b-197e-00000000007f] 25201 1726882706.91546: sending task result for task 0e448fcc-3ce9-313b-197e-00000000007f skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 25201 1726882706.91700: no more pending results, returning what we have 25201 1726882706.91704: results queue empty 25201 1726882706.91705: checking for any_errors_fatal 25201 1726882706.91716: done checking for any_errors_fatal 25201 1726882706.91717: checking for max_fail_percentage 25201 1726882706.91719: done checking for max_fail_percentage 25201 1726882706.91720: checking to see if all hosts have failed and the running result is not ok 25201 1726882706.91721: done checking to see if all hosts have failed 25201 1726882706.91722: getting the remaining hosts for this loop 25201 1726882706.91724: done getting the remaining hosts for this loop 25201 1726882706.91728: getting the next task for host managed_node2 25201 1726882706.91737: done getting next task for host managed_node2 25201 1726882706.91741: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 25201 1726882706.91744: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 25201 1726882706.91762: getting variables 25201 1726882706.91766: in VariableManager get_vars() 25201 1726882706.91808: Calling all_inventory to load vars for managed_node2 25201 1726882706.91811: Calling groups_inventory to load vars for managed_node2 25201 1726882706.91814: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882706.91827: Calling all_plugins_play to load vars for managed_node2 25201 1726882706.91831: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882706.91834: Calling groups_plugins_play to load vars for managed_node2 25201 1726882706.93085: done sending task result for task 0e448fcc-3ce9-313b-197e-00000000007f 25201 1726882706.93088: WORKER PROCESS EXITING 25201 1726882706.93866: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882706.95697: done with get_vars() 25201 1726882706.95718: done getting variables 25201 1726882706.95779: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 21:38:26 -0400 (0:00:00.060) 0:00:28.132 ****** 25201 1726882706.95813: entering _queue_task() for managed_node2/debug 25201 1726882706.96079: worker is 1 (out of 1 available) 25201 1726882706.96093: exiting _queue_task() for managed_node2/debug 25201 1726882706.96104: done queuing things up, now waiting for results queue to drain 25201 1726882706.96105: waiting for pending results... 
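The state-based path at tasks/main.yml:171 above is skipped because network_state comes from the role defaults and is still an empty dict, so its guard evaluates to False. A sketch of that guard, with the module arguments left out because this run never builds them:

- name: Configure networking state
  fedora.linux_system_roles.network_state:   # module name as shown in the _queue_task entry above
  when: network_state != {}                  # the false_condition reported in the skip result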
25201 1726882706.96379: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 25201 1726882706.96530: in run() - task 0e448fcc-3ce9-313b-197e-000000000080 25201 1726882706.96579: variable 'ansible_search_path' from source: unknown 25201 1726882706.96587: variable 'ansible_search_path' from source: unknown 25201 1726882706.96628: calling self._execute() 25201 1726882706.96780: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882706.96797: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882706.96819: variable 'omit' from source: magic vars 25201 1726882706.97221: variable 'ansible_distribution_major_version' from source: facts 25201 1726882706.97240: Evaluated conditional (ansible_distribution_major_version != '6'): True 25201 1726882706.97252: variable 'omit' from source: magic vars 25201 1726882706.97316: variable 'omit' from source: magic vars 25201 1726882706.97359: variable 'omit' from source: magic vars 25201 1726882706.97408: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25201 1726882706.97452: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25201 1726882706.97484: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25201 1726882706.97507: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25201 1726882706.97526: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25201 1726882706.97597: variable 'inventory_hostname' from source: host vars for 'managed_node2' 25201 1726882706.97606: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882706.97616: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882706.97732: Set connection var ansible_shell_executable to /bin/sh 25201 1726882706.97743: Set connection var ansible_pipelining to False 25201 1726882706.97752: Set connection var ansible_connection to ssh 25201 1726882706.97762: Set connection var ansible_module_compression to ZIP_DEFLATED 25201 1726882706.97774: Set connection var ansible_shell_type to sh 25201 1726882706.97787: Set connection var ansible_timeout to 10 25201 1726882706.97816: variable 'ansible_shell_executable' from source: unknown 25201 1726882706.97823: variable 'ansible_connection' from source: unknown 25201 1726882706.97831: variable 'ansible_module_compression' from source: unknown 25201 1726882706.97837: variable 'ansible_shell_type' from source: unknown 25201 1726882706.97844: variable 'ansible_shell_executable' from source: unknown 25201 1726882706.97850: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882706.97857: variable 'ansible_pipelining' from source: unknown 25201 1726882706.97865: variable 'ansible_timeout' from source: unknown 25201 1726882706.97875: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882706.98025: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 25201 
1726882706.98042: variable 'omit' from source: magic vars 25201 1726882706.98051: starting attempt loop 25201 1726882706.98057: running the handler 25201 1726882706.98195: variable '__network_connections_result' from source: set_fact 25201 1726882706.98270: handler run complete 25201 1726882706.98294: attempt loop complete, returning result 25201 1726882706.98301: _execute() done 25201 1726882706.98308: dumping result to json 25201 1726882706.98321: done dumping result, returning 25201 1726882706.98336: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0e448fcc-3ce9-313b-197e-000000000080] 25201 1726882706.98349: sending task result for task 0e448fcc-3ce9-313b-197e-000000000080 ok: [managed_node2] => { "__network_connections_result.stderr_lines": [ "" ] } 25201 1726882706.98508: no more pending results, returning what we have 25201 1726882706.98511: results queue empty 25201 1726882706.98512: checking for any_errors_fatal 25201 1726882706.98520: done checking for any_errors_fatal 25201 1726882706.98521: checking for max_fail_percentage 25201 1726882706.98522: done checking for max_fail_percentage 25201 1726882706.98523: checking to see if all hosts have failed and the running result is not ok 25201 1726882706.98524: done checking to see if all hosts have failed 25201 1726882706.98525: getting the remaining hosts for this loop 25201 1726882706.98527: done getting the remaining hosts for this loop 25201 1726882706.98531: getting the next task for host managed_node2 25201 1726882706.98538: done getting next task for host managed_node2 25201 1726882706.98542: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 25201 1726882706.98545: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 25201 1726882706.98555: getting variables 25201 1726882706.98557: in VariableManager get_vars() 25201 1726882706.98598: Calling all_inventory to load vars for managed_node2 25201 1726882706.98601: Calling groups_inventory to load vars for managed_node2 25201 1726882706.98604: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882706.98614: Calling all_plugins_play to load vars for managed_node2 25201 1726882706.98617: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882706.98620: Calling groups_plugins_play to load vars for managed_node2 25201 1726882706.99621: done sending task result for task 0e448fcc-3ce9-313b-197e-000000000080 25201 1726882706.99625: WORKER PROCESS EXITING 25201 1726882707.00621: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882707.02378: done with get_vars() 25201 1726882707.02400: done getting variables 25201 1726882707.02455: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 21:38:27 -0400 (0:00:00.066) 0:00:28.199 ****** 25201 1726882707.02493: entering _queue_task() for managed_node2/debug 25201 1726882707.02787: worker is 1 (out of 1 available) 25201 1726882707.02805: exiting _queue_task() for managed_node2/debug 25201 1726882707.02835: done queuing things up, now waiting for results queue to drain 25201 1726882707.02845: waiting for pending results... 
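The debug tasks at tasks/main.yml:177 and main.yml:181 simply print the registered __network_connections_result fact; assuming they do nothing beyond that, a minimal sketch consistent with the ok output above and the one that follows would be:

- name: Show stderr messages for the network_connections
  ansible.builtin.debug:
    var: __network_connections_result.stderr_lines

- name: Show debug messages for the network_connections
  ansible.builtin.debug:
    var: __network_connections_result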
25201 1726882707.03219: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 25201 1726882707.03365: in run() - task 0e448fcc-3ce9-313b-197e-000000000081 25201 1726882707.03385: variable 'ansible_search_path' from source: unknown 25201 1726882707.03393: variable 'ansible_search_path' from source: unknown 25201 1726882707.03434: calling self._execute() 25201 1726882707.03542: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882707.03555: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882707.03575: variable 'omit' from source: magic vars 25201 1726882707.03977: variable 'ansible_distribution_major_version' from source: facts 25201 1726882707.03995: Evaluated conditional (ansible_distribution_major_version != '6'): True 25201 1726882707.04013: variable 'omit' from source: magic vars 25201 1726882707.04074: variable 'omit' from source: magic vars 25201 1726882707.04115: variable 'omit' from source: magic vars 25201 1726882707.04160: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25201 1726882707.04205: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25201 1726882707.04234: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25201 1726882707.04258: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25201 1726882707.04281: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25201 1726882707.04317: variable 'inventory_hostname' from source: host vars for 'managed_node2' 25201 1726882707.04326: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882707.04336: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882707.04458: Set connection var ansible_shell_executable to /bin/sh 25201 1726882707.04473: Set connection var ansible_pipelining to False 25201 1726882707.04509: Set connection var ansible_connection to ssh 25201 1726882707.04520: Set connection var ansible_module_compression to ZIP_DEFLATED 25201 1726882707.04528: Set connection var ansible_shell_type to sh 25201 1726882707.04540: Set connection var ansible_timeout to 10 25201 1726882707.04571: variable 'ansible_shell_executable' from source: unknown 25201 1726882707.04603: variable 'ansible_connection' from source: unknown 25201 1726882707.04611: variable 'ansible_module_compression' from source: unknown 25201 1726882707.04618: variable 'ansible_shell_type' from source: unknown 25201 1726882707.04625: variable 'ansible_shell_executable' from source: unknown 25201 1726882707.04632: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882707.04640: variable 'ansible_pipelining' from source: unknown 25201 1726882707.04647: variable 'ansible_timeout' from source: unknown 25201 1726882707.04660: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882707.04939: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 25201 
1726882707.04955: variable 'omit' from source: magic vars 25201 1726882707.04966: starting attempt loop 25201 1726882707.04978: running the handler 25201 1726882707.05031: variable '__network_connections_result' from source: set_fact 25201 1726882707.05117: variable '__network_connections_result' from source: set_fact 25201 1726882707.05233: handler run complete 25201 1726882707.05262: attempt loop complete, returning result 25201 1726882707.05273: _execute() done 25201 1726882707.05279: dumping result to json 25201 1726882707.05287: done dumping result, returning 25201 1726882707.05303: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0e448fcc-3ce9-313b-197e-000000000081] 25201 1726882707.05317: sending task result for task 0e448fcc-3ce9-313b-197e-000000000081 ok: [managed_node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "veth0", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 25201 1726882707.05500: no more pending results, returning what we have 25201 1726882707.05504: results queue empty 25201 1726882707.05505: checking for any_errors_fatal 25201 1726882707.05514: done checking for any_errors_fatal 25201 1726882707.05515: checking for max_fail_percentage 25201 1726882707.05517: done checking for max_fail_percentage 25201 1726882707.05518: checking to see if all hosts have failed and the running result is not ok 25201 1726882707.05519: done checking to see if all hosts have failed 25201 1726882707.05520: getting the remaining hosts for this loop 25201 1726882707.05521: done getting the remaining hosts for this loop 25201 1726882707.05525: getting the next task for host managed_node2 25201 1726882707.05533: done getting next task for host managed_node2 25201 1726882707.05536: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 25201 1726882707.05540: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 25201 1726882707.05550: getting variables 25201 1726882707.05552: in VariableManager get_vars() 25201 1726882707.05592: Calling all_inventory to load vars for managed_node2 25201 1726882707.05595: Calling groups_inventory to load vars for managed_node2 25201 1726882707.05598: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882707.05608: Calling all_plugins_play to load vars for managed_node2 25201 1726882707.05612: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882707.05615: Calling groups_plugins_play to load vars for managed_node2 25201 1726882707.06604: done sending task result for task 0e448fcc-3ce9-313b-197e-000000000081 25201 1726882707.06608: WORKER PROCESS EXITING 25201 1726882707.07468: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882707.09517: done with get_vars() 25201 1726882707.09537: done getting variables 25201 1726882707.09707: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 21:38:27 -0400 (0:00:00.072) 0:00:28.272 ****** 25201 1726882707.09742: entering _queue_task() for managed_node2/debug 25201 1726882707.10226: worker is 1 (out of 1 available) 25201 1726882707.10240: exiting _queue_task() for managed_node2/debug 25201 1726882707.10250: done queuing things up, now waiting for results queue to drain 25201 1726882707.10252: waiting for pending results... 
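The result dumped just above shows the module_args the role passed to the nm provider: a single profile, veth0, with persistent_state: absent and state: down, i.e. the teardown of the test veth interface. A sketch of role input that would produce that invocation, assuming the role's documented network_connections interface; the invoking task below is illustrative only (the test playbook tests_ipv6.yml presumably supplies something equivalent):

- name: Tear down the veth0 test profile (equivalent role input)
  include_role:
    name: fedora.linux_system_roles.network
  vars:
    network_connections:
      - name: veth0              # profile to delete
        persistent_state: absent # drop the persistent configuration
        state: down              # and take the runtime connection down

The next task queued here, "Show debug messages for the network_state", is skipped immediately below because network_state is still the role default of {}.
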
25201 1726882707.11122: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 25201 1726882707.11263: in run() - task 0e448fcc-3ce9-313b-197e-000000000082 25201 1726882707.11289: variable 'ansible_search_path' from source: unknown 25201 1726882707.11296: variable 'ansible_search_path' from source: unknown 25201 1726882707.11339: calling self._execute() 25201 1726882707.11442: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882707.11454: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882707.11471: variable 'omit' from source: magic vars 25201 1726882707.11860: variable 'ansible_distribution_major_version' from source: facts 25201 1726882707.11885: Evaluated conditional (ansible_distribution_major_version != '6'): True 25201 1726882707.12015: variable 'network_state' from source: role '' defaults 25201 1726882707.12030: Evaluated conditional (network_state != {}): False 25201 1726882707.12042: when evaluation is False, skipping this task 25201 1726882707.12049: _execute() done 25201 1726882707.12055: dumping result to json 25201 1726882707.12062: done dumping result, returning 25201 1726882707.12075: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0e448fcc-3ce9-313b-197e-000000000082] 25201 1726882707.12090: sending task result for task 0e448fcc-3ce9-313b-197e-000000000082 skipping: [managed_node2] => { "false_condition": "network_state != {}" } 25201 1726882707.12229: no more pending results, returning what we have 25201 1726882707.12233: results queue empty 25201 1726882707.12234: checking for any_errors_fatal 25201 1726882707.12243: done checking for any_errors_fatal 25201 1726882707.12244: checking for max_fail_percentage 25201 1726882707.12246: done checking for max_fail_percentage 25201 1726882707.12247: checking to see if all hosts have failed and the running result is not ok 25201 1726882707.12248: done checking to see if all hosts have failed 25201 1726882707.12249: getting the remaining hosts for this loop 25201 1726882707.12251: done getting the remaining hosts for this loop 25201 1726882707.12254: getting the next task for host managed_node2 25201 1726882707.12262: done getting next task for host managed_node2 25201 1726882707.12267: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 25201 1726882707.12271: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 25201 1726882707.12290: getting variables 25201 1726882707.12292: in VariableManager get_vars() 25201 1726882707.12331: Calling all_inventory to load vars for managed_node2 25201 1726882707.12334: Calling groups_inventory to load vars for managed_node2 25201 1726882707.12336: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882707.12348: Calling all_plugins_play to load vars for managed_node2 25201 1726882707.12351: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882707.12354: Calling groups_plugins_play to load vars for managed_node2 25201 1726882707.13382: done sending task result for task 0e448fcc-3ce9-313b-197e-000000000082 25201 1726882707.13385: WORKER PROCESS EXITING 25201 1726882707.13912: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882707.15577: done with get_vars() 25201 1726882707.15597: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 21:38:27 -0400 (0:00:00.059) 0:00:28.331 ****** 25201 1726882707.15687: entering _queue_task() for managed_node2/ping 25201 1726882707.15913: worker is 1 (out of 1 available) 25201 1726882707.15925: exiting _queue_task() for managed_node2/ping 25201 1726882707.15936: done queuing things up, now waiting for results queue to drain 25201 1726882707.15937: waiting for pending results... 25201 1726882707.16197: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 25201 1726882707.16332: in run() - task 0e448fcc-3ce9-313b-197e-000000000083 25201 1726882707.16352: variable 'ansible_search_path' from source: unknown 25201 1726882707.16358: variable 'ansible_search_path' from source: unknown 25201 1726882707.16396: calling self._execute() 25201 1726882707.16480: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882707.16493: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882707.16505: variable 'omit' from source: magic vars 25201 1726882707.16888: variable 'ansible_distribution_major_version' from source: facts 25201 1726882707.16906: Evaluated conditional (ansible_distribution_major_version != '6'): True 25201 1726882707.16923: variable 'omit' from source: magic vars 25201 1726882707.16984: variable 'omit' from source: magic vars 25201 1726882707.17024: variable 'omit' from source: magic vars 25201 1726882707.17073: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25201 1726882707.17114: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25201 1726882707.17141: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25201 1726882707.17166: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25201 1726882707.17184: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25201 1726882707.17218: variable 'inventory_hostname' from source: host vars for 'managed_node2' 25201 1726882707.17227: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882707.17235: variable 'ansible_ssh_extra_args' from source: host vars 
for 'managed_node2' 25201 1726882707.17346: Set connection var ansible_shell_executable to /bin/sh 25201 1726882707.17361: Set connection var ansible_pipelining to False 25201 1726882707.17376: Set connection var ansible_connection to ssh 25201 1726882707.17386: Set connection var ansible_module_compression to ZIP_DEFLATED 25201 1726882707.17393: Set connection var ansible_shell_type to sh 25201 1726882707.17405: Set connection var ansible_timeout to 10 25201 1726882707.17432: variable 'ansible_shell_executable' from source: unknown 25201 1726882707.17440: variable 'ansible_connection' from source: unknown 25201 1726882707.17448: variable 'ansible_module_compression' from source: unknown 25201 1726882707.17455: variable 'ansible_shell_type' from source: unknown 25201 1726882707.17465: variable 'ansible_shell_executable' from source: unknown 25201 1726882707.17477: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882707.17486: variable 'ansible_pipelining' from source: unknown 25201 1726882707.17493: variable 'ansible_timeout' from source: unknown 25201 1726882707.17501: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882707.17702: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 25201 1726882707.17720: variable 'omit' from source: magic vars 25201 1726882707.17728: starting attempt loop 25201 1726882707.17735: running the handler 25201 1726882707.17752: _low_level_execute_command(): starting 25201 1726882707.17766: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 25201 1726882707.18521: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25201 1726882707.18537: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882707.18557: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882707.18581: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882707.18624: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882707.18636: stderr chunk (state=3): >>>debug2: match not found <<< 25201 1726882707.18648: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882707.18674: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25201 1726882707.18686: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 25201 1726882707.18699: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 25201 1726882707.18712: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882707.18725: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882707.18739: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882707.18751: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882707.18765: stderr chunk (state=3): >>>debug2: match found <<< 25201 1726882707.18782: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882707.18857: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882707.18885: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882707.18902: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882707.19050: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882707.20695: stdout chunk (state=3): >>>/root <<< 25201 1726882707.20794: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882707.20848: stderr chunk (state=3): >>><<< 25201 1726882707.20857: stdout chunk (state=3): >>><<< 25201 1726882707.20899: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25201 1726882707.20902: _low_level_execute_command(): starting 25201 1726882707.20909: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882707.2089171-26435-8892963654221 `" && echo ansible-tmp-1726882707.2089171-26435-8892963654221="` echo /root/.ansible/tmp/ansible-tmp-1726882707.2089171-26435-8892963654221 `" ) && sleep 0' 25201 1726882707.21513: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25201 1726882707.21523: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882707.21537: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882707.21549: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882707.21589: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882707.21597: stderr chunk (state=3): >>>debug2: match not found <<< 25201 1726882707.21606: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882707.21618: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25201 1726882707.21625: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 25201 1726882707.21633: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 25201 1726882707.21640: stderr chunk (state=3): >>>debug1: Reading 
configuration data /root/.ssh/config <<< 25201 1726882707.21653: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882707.21667: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882707.21674: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882707.21681: stderr chunk (state=3): >>>debug2: match found <<< 25201 1726882707.21690: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882707.21760: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882707.21782: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882707.21794: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882707.21922: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882707.23789: stdout chunk (state=3): >>>ansible-tmp-1726882707.2089171-26435-8892963654221=/root/.ansible/tmp/ansible-tmp-1726882707.2089171-26435-8892963654221 <<< 25201 1726882707.23897: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882707.23942: stderr chunk (state=3): >>><<< 25201 1726882707.23946: stdout chunk (state=3): >>><<< 25201 1726882707.23958: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882707.2089171-26435-8892963654221=/root/.ansible/tmp/ansible-tmp-1726882707.2089171-26435-8892963654221 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25201 1726882707.24005: variable 'ansible_module_compression' from source: unknown 25201 1726882707.24038: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-25201fmfeipqk/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 25201 1726882707.24096: variable 'ansible_facts' from source: unknown 25201 1726882707.24144: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882707.2089171-26435-8892963654221/AnsiballZ_ping.py 25201 1726882707.24276: Sending initial data 25201 1726882707.24279: Sent initial data (151 bytes) 25201 1726882707.25159: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25201 1726882707.25169: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882707.25180: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882707.25195: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882707.25233: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882707.25243: stderr chunk (state=3): >>>debug2: match not found <<< 25201 1726882707.25248: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882707.25261: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25201 1726882707.25273: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 25201 1726882707.25280: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 25201 1726882707.25288: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882707.25297: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882707.25309: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882707.25317: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882707.25323: stderr chunk (state=3): >>>debug2: match found <<< 25201 1726882707.25333: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882707.25407: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882707.25424: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882707.25436: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882707.25559: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882707.27325: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 25201 1726882707.27426: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 25201 1726882707.27527: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-25201fmfeipqk/tmp0dp_eazb /root/.ansible/tmp/ansible-tmp-1726882707.2089171-26435-8892963654221/AnsiballZ_ping.py <<< 25201 1726882707.27629: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 25201 1726882707.28871: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882707.29071: stderr chunk (state=3): >>><<< 25201 1726882707.29075: stdout chunk (state=3): >>><<< 25201 1726882707.29077: done transferring module to remote 25201 1726882707.29084: _low_level_execute_command(): starting 25201 1726882707.29086: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x 
/root/.ansible/tmp/ansible-tmp-1726882707.2089171-26435-8892963654221/ /root/.ansible/tmp/ansible-tmp-1726882707.2089171-26435-8892963654221/AnsiballZ_ping.py && sleep 0' 25201 1726882707.29679: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25201 1726882707.29694: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882707.29709: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882707.29731: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882707.29773: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882707.29787: stderr chunk (state=3): >>>debug2: match not found <<< 25201 1726882707.29801: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882707.29820: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25201 1726882707.29832: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 25201 1726882707.29845: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 25201 1726882707.29855: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882707.29869: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882707.29885: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882707.29895: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882707.29905: stderr chunk (state=3): >>>debug2: match found <<< 25201 1726882707.29918: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882707.29989: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882707.30005: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882707.30020: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882707.30145: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882707.31950: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882707.31956: stdout chunk (state=3): >>><<< 25201 1726882707.31965: stderr chunk (state=3): >>><<< 25201 1726882707.31979: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25201 1726882707.31983: _low_level_execute_command(): starting 25201 1726882707.31987: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882707.2089171-26435-8892963654221/AnsiballZ_ping.py && sleep 0' 25201 1726882707.32572: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25201 1726882707.32582: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882707.32592: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882707.32605: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882707.32642: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882707.32649: stderr chunk (state=3): >>>debug2: match not found <<< 25201 1726882707.32658: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882707.32678: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25201 1726882707.32685: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 25201 1726882707.32691: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 25201 1726882707.32699: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882707.32708: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882707.32720: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882707.32728: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882707.32731: stderr chunk (state=3): >>>debug2: match found <<< 25201 1726882707.32740: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882707.32813: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882707.32823: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882707.32841: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882707.32978: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882707.45749: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 25201 1726882707.46744: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
<<< 25201 1726882707.46800: stderr chunk (state=3): >>><<< 25201 1726882707.46803: stdout chunk (state=3): >>><<< 25201 1726882707.46817: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 25201 1726882707.46839: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882707.2089171-26435-8892963654221/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 25201 1726882707.46846: _low_level_execute_command(): starting 25201 1726882707.46850: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882707.2089171-26435-8892963654221/ > /dev/null 2>&1 && sleep 0' 25201 1726882707.47525: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25201 1726882707.47542: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882707.47554: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882707.47575: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882707.47613: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882707.47620: stderr chunk (state=3): >>>debug2: match not found <<< 25201 1726882707.47630: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882707.47651: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25201 1726882707.47658: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 25201 1726882707.47669: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 25201 1726882707.47677: stderr chunk 
(state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882707.47690: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882707.47701: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882707.47708: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882707.47714: stderr chunk (state=3): >>>debug2: match found <<< 25201 1726882707.47724: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882707.47804: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882707.47820: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882707.47831: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882707.47953: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882707.49748: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882707.49797: stderr chunk (state=3): >>><<< 25201 1726882707.49802: stdout chunk (state=3): >>><<< 25201 1726882707.49814: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25201 1726882707.49819: handler run complete 25201 1726882707.49831: attempt loop complete, returning result 25201 1726882707.49833: _execute() done 25201 1726882707.49835: dumping result to json 25201 1726882707.49838: done dumping result, returning 25201 1726882707.49846: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [0e448fcc-3ce9-313b-197e-000000000083] 25201 1726882707.49851: sending task result for task 0e448fcc-3ce9-313b-197e-000000000083 25201 1726882707.49938: done sending task result for task 0e448fcc-3ce9-313b-197e-000000000083 25201 1726882707.49941: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "ping": "pong" } 25201 1726882707.50011: no more pending results, returning what we have 25201 1726882707.50015: results queue empty 25201 1726882707.50015: checking for any_errors_fatal 25201 1726882707.50022: done checking for any_errors_fatal 25201 1726882707.50022: checking for max_fail_percentage 25201 1726882707.50024: done checking for max_fail_percentage 25201 
1726882707.50025: checking to see if all hosts have failed and the running result is not ok 25201 1726882707.50025: done checking to see if all hosts have failed 25201 1726882707.50026: getting the remaining hosts for this loop 25201 1726882707.50028: done getting the remaining hosts for this loop 25201 1726882707.50031: getting the next task for host managed_node2 25201 1726882707.50041: done getting next task for host managed_node2 25201 1726882707.50043: ^ task is: TASK: meta (role_complete) 25201 1726882707.50046: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 25201 1726882707.50058: getting variables 25201 1726882707.50060: in VariableManager get_vars() 25201 1726882707.50101: Calling all_inventory to load vars for managed_node2 25201 1726882707.50104: Calling groups_inventory to load vars for managed_node2 25201 1726882707.50106: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882707.50115: Calling all_plugins_play to load vars for managed_node2 25201 1726882707.50117: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882707.50120: Calling groups_plugins_play to load vars for managed_node2 25201 1726882707.51079: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882707.51993: done with get_vars() 25201 1726882707.52008: done getting variables 25201 1726882707.52085: done queuing things up, now waiting for results queue to drain 25201 1726882707.52086: results queue empty 25201 1726882707.52087: checking for any_errors_fatal 25201 1726882707.52088: done checking for any_errors_fatal 25201 1726882707.52089: checking for max_fail_percentage 25201 1726882707.52090: done checking for max_fail_percentage 25201 1726882707.52090: checking to see if all hosts have failed and the running result is not ok 25201 1726882707.52090: done checking to see if all hosts have failed 25201 1726882707.52093: getting the remaining hosts for this loop 25201 1726882707.52094: done getting the remaining hosts for this loop 25201 1726882707.52095: getting the next task for host managed_node2 25201 1726882707.52098: done getting next task for host managed_node2 25201 1726882707.52099: ^ task is: TASK: Include the task 'manage_test_interface.yml' 25201 1726882707.52100: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=4, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25201 1726882707.52102: getting variables 25201 1726882707.52102: in VariableManager get_vars() 25201 1726882707.52111: Calling all_inventory to load vars for managed_node2 25201 1726882707.52113: Calling groups_inventory to load vars for managed_node2 25201 1726882707.52114: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882707.52118: Calling all_plugins_play to load vars for managed_node2 25201 1726882707.52120: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882707.52122: Calling groups_plugins_play to load vars for managed_node2 25201 1726882707.53237: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882707.54710: done with get_vars() 25201 1726882707.54724: done getting variables TASK [Include the task 'manage_test_interface.yml'] **************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:104 Friday 20 September 2024 21:38:27 -0400 (0:00:00.390) 0:00:28.722 ****** 25201 1726882707.54774: entering _queue_task() for managed_node2/include_tasks 25201 1726882707.54978: worker is 1 (out of 1 available) 25201 1726882707.54994: exiting _queue_task() for managed_node2/include_tasks 25201 1726882707.55005: done queuing things up, now waiting for results queue to drain 25201 1726882707.55007: waiting for pending results... 25201 1726882707.55174: running TaskExecutor() for managed_node2/TASK: Include the task 'manage_test_interface.yml' 25201 1726882707.55240: in run() - task 0e448fcc-3ce9-313b-197e-0000000000b3 25201 1726882707.55251: variable 'ansible_search_path' from source: unknown 25201 1726882707.55284: calling self._execute() 25201 1726882707.55352: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882707.55355: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882707.55366: variable 'omit' from source: magic vars 25201 1726882707.55642: variable 'ansible_distribution_major_version' from source: facts 25201 1726882707.55652: Evaluated conditional (ansible_distribution_major_version != '6'): True 25201 1726882707.55658: _execute() done 25201 1726882707.55662: dumping result to json 25201 1726882707.55665: done dumping result, returning 25201 1726882707.55675: done running TaskExecutor() for managed_node2/TASK: Include the task 'manage_test_interface.yml' [0e448fcc-3ce9-313b-197e-0000000000b3] 25201 1726882707.55681: sending task result for task 0e448fcc-3ce9-313b-197e-0000000000b3 25201 1726882707.55772: done sending task result for task 0e448fcc-3ce9-313b-197e-0000000000b3 25201 1726882707.55776: WORKER PROCESS EXITING 25201 1726882707.55807: no more pending results, returning what we have 25201 1726882707.55811: in VariableManager get_vars() 25201 1726882707.55851: Calling all_inventory to load vars for managed_node2 25201 1726882707.55854: Calling groups_inventory to load vars for managed_node2 25201 1726882707.55855: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882707.55867: Calling all_plugins_play to load vars for managed_node2 25201 1726882707.55870: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882707.55874: Calling groups_plugins_play to load vars for managed_node2 25201 1726882707.57200: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882707.59305: done with get_vars() 25201 
1726882707.59330: variable 'ansible_search_path' from source: unknown 25201 1726882707.59346: we have included files to process 25201 1726882707.59347: generating all_blocks data 25201 1726882707.59348: done generating all_blocks data 25201 1726882707.59352: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 25201 1726882707.59352: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 25201 1726882707.59354: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 25201 1726882707.59637: in VariableManager get_vars() 25201 1726882707.59683: done with get_vars() 25201 1726882707.60380: done processing included file 25201 1726882707.60382: iterating over new_blocks loaded from include file 25201 1726882707.60383: in VariableManager get_vars() 25201 1726882707.60400: done with get_vars() 25201 1726882707.60402: filtering new block on tags 25201 1726882707.60433: done filtering new block on tags 25201 1726882707.60436: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml for managed_node2 25201 1726882707.60441: extending task lists for all hosts with included blocks 25201 1726882707.63154: done extending task lists 25201 1726882707.63155: done processing included files 25201 1726882707.63156: results queue empty 25201 1726882707.63157: checking for any_errors_fatal 25201 1726882707.63159: done checking for any_errors_fatal 25201 1726882707.63159: checking for max_fail_percentage 25201 1726882707.63160: done checking for max_fail_percentage 25201 1726882707.63161: checking to see if all hosts have failed and the running result is not ok 25201 1726882707.63162: done checking to see if all hosts have failed 25201 1726882707.63163: getting the remaining hosts for this loop 25201 1726882707.63166: done getting the remaining hosts for this loop 25201 1726882707.63168: getting the next task for host managed_node2 25201 1726882707.63172: done getting next task for host managed_node2 25201 1726882707.63174: ^ task is: TASK: Ensure state in ["present", "absent"] 25201 1726882707.63176: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=5, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 25201 1726882707.63179: getting variables 25201 1726882707.63180: in VariableManager get_vars() 25201 1726882707.63197: Calling all_inventory to load vars for managed_node2 25201 1726882707.63199: Calling groups_inventory to load vars for managed_node2 25201 1726882707.63201: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882707.63208: Calling all_plugins_play to load vars for managed_node2 25201 1726882707.63210: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882707.63213: Calling groups_plugins_play to load vars for managed_node2 25201 1726882707.64629: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882707.66412: done with get_vars() 25201 1726882707.66434: done getting variables 25201 1726882707.66479: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Ensure state in ["present", "absent"]] *********************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:3 Friday 20 September 2024 21:38:27 -0400 (0:00:00.117) 0:00:28.839 ****** 25201 1726882707.66507: entering _queue_task() for managed_node2/fail 25201 1726882707.66887: worker is 1 (out of 1 available) 25201 1726882707.66900: exiting _queue_task() for managed_node2/fail 25201 1726882707.66912: done queuing things up, now waiting for results queue to drain 25201 1726882707.66913: waiting for pending results... 25201 1726882707.67797: running TaskExecutor() for managed_node2/TASK: Ensure state in ["present", "absent"] 25201 1726882707.68034: in run() - task 0e448fcc-3ce9-313b-197e-0000000005cc 25201 1726882707.68051: variable 'ansible_search_path' from source: unknown 25201 1726882707.68057: variable 'ansible_search_path' from source: unknown 25201 1726882707.68102: calling self._execute() 25201 1726882707.68197: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882707.68211: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882707.68226: variable 'omit' from source: magic vars 25201 1726882707.69615: variable 'ansible_distribution_major_version' from source: facts 25201 1726882707.69644: Evaluated conditional (ansible_distribution_major_version != '6'): True 25201 1726882707.69951: variable 'state' from source: include params 25201 1726882707.69962: Evaluated conditional (state not in ["present", "absent"]): False 25201 1726882707.69972: when evaluation is False, skipping this task 25201 1726882707.69979: _execute() done 25201 1726882707.70032: dumping result to json 25201 1726882707.70099: done dumping result, returning 25201 1726882707.70172: done running TaskExecutor() for managed_node2/TASK: Ensure state in ["present", "absent"] [0e448fcc-3ce9-313b-197e-0000000005cc] 25201 1726882707.70206: sending task result for task 0e448fcc-3ce9-313b-197e-0000000005cc skipping: [managed_node2] => { "changed": false, "false_condition": "state not in [\"present\", \"absent\"]", "skip_reason": "Conditional result was False" } 25201 1726882707.70480: no more pending results, returning what we have 25201 1726882707.70485: results queue empty 25201 1726882707.70487: checking for any_errors_fatal 25201 
1726882707.70488: done checking for any_errors_fatal 25201 1726882707.70489: checking for max_fail_percentage 25201 1726882707.70498: done checking for max_fail_percentage 25201 1726882707.70500: checking to see if all hosts have failed and the running result is not ok 25201 1726882707.70501: done checking to see if all hosts have failed 25201 1726882707.70502: getting the remaining hosts for this loop 25201 1726882707.70504: done getting the remaining hosts for this loop 25201 1726882707.70509: getting the next task for host managed_node2 25201 1726882707.70524: done getting next task for host managed_node2 25201 1726882707.70526: ^ task is: TASK: Ensure type in ["dummy", "tap", "veth"] 25201 1726882707.70538: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=5, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 25201 1726882707.70548: getting variables 25201 1726882707.70551: in VariableManager get_vars() 25201 1726882707.70640: Calling all_inventory to load vars for managed_node2 25201 1726882707.70645: Calling groups_inventory to load vars for managed_node2 25201 1726882707.70648: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882707.70676: Calling all_plugins_play to load vars for managed_node2 25201 1726882707.70682: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882707.70697: Calling groups_plugins_play to load vars for managed_node2 25201 1726882707.71762: done sending task result for task 0e448fcc-3ce9-313b-197e-0000000005cc 25201 1726882707.71767: WORKER PROCESS EXITING 25201 1726882707.72537: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882707.74458: done with get_vars() 25201 1726882707.74506: done getting variables 25201 1726882707.74588: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Ensure type in ["dummy", "tap", "veth"]] ********************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:8 Friday 20 September 2024 21:38:27 -0400 (0:00:00.081) 0:00:28.920 ****** 25201 1726882707.74621: entering _queue_task() for managed_node2/fail 25201 1726882707.75023: worker is 1 (out of 1 available) 25201 1726882707.75036: exiting _queue_task() for managed_node2/fail 25201 1726882707.75061: done queuing things up, now waiting for results queue to drain 25201 1726882707.75067: waiting for pending results... 
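manage_test_interface.yml opens with two guard tasks that fail fast on unsupported parameters; in this run the state check is skipped just above and the type check is skipped immediately below, and the logged false_condition strings give the exact when: expressions. A sketch of that guard pattern, with placeholder fail messages (the real tasks at tasks/manage_test_interface.yml:3 and :8 may word them differently):

- name: Ensure state in ["present", "absent"]
  fail:
    msg: "Unsupported state: {{ state }}"   # placeholder message
  when: state not in ["present", "absent"]

- name: Ensure type in ["dummy", "tap", "veth"]
  fail:
    msg: "Unsupported type: {{ type }}"     # placeholder message
  when: type not in ["dummy", "tap", "veth"]
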
25201 1726882707.75445: running TaskExecutor() for managed_node2/TASK: Ensure type in ["dummy", "tap", "veth"] 25201 1726882707.75569: in run() - task 0e448fcc-3ce9-313b-197e-0000000005cd 25201 1726882707.75588: variable 'ansible_search_path' from source: unknown 25201 1726882707.75596: variable 'ansible_search_path' from source: unknown 25201 1726882707.75649: calling self._execute() 25201 1726882707.75799: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882707.75809: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882707.75839: variable 'omit' from source: magic vars 25201 1726882707.76295: variable 'ansible_distribution_major_version' from source: facts 25201 1726882707.76312: Evaluated conditional (ansible_distribution_major_version != '6'): True 25201 1726882707.76482: variable 'type' from source: play vars 25201 1726882707.76493: Evaluated conditional (type not in ["dummy", "tap", "veth"]): False 25201 1726882707.76503: when evaluation is False, skipping this task 25201 1726882707.76510: _execute() done 25201 1726882707.76516: dumping result to json 25201 1726882707.76523: done dumping result, returning 25201 1726882707.76531: done running TaskExecutor() for managed_node2/TASK: Ensure type in ["dummy", "tap", "veth"] [0e448fcc-3ce9-313b-197e-0000000005cd] 25201 1726882707.76540: sending task result for task 0e448fcc-3ce9-313b-197e-0000000005cd skipping: [managed_node2] => { "changed": false, "false_condition": "type not in [\"dummy\", \"tap\", \"veth\"]", "skip_reason": "Conditional result was False" } 25201 1726882707.76697: no more pending results, returning what we have 25201 1726882707.76701: results queue empty 25201 1726882707.76702: checking for any_errors_fatal 25201 1726882707.76707: done checking for any_errors_fatal 25201 1726882707.76708: checking for max_fail_percentage 25201 1726882707.76710: done checking for max_fail_percentage 25201 1726882707.76711: checking to see if all hosts have failed and the running result is not ok 25201 1726882707.76712: done checking to see if all hosts have failed 25201 1726882707.76713: getting the remaining hosts for this loop 25201 1726882707.76714: done getting the remaining hosts for this loop 25201 1726882707.76718: getting the next task for host managed_node2 25201 1726882707.76727: done getting next task for host managed_node2 25201 1726882707.76730: ^ task is: TASK: Include the task 'show_interfaces.yml' 25201 1726882707.76733: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=5, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 25201 1726882707.76738: getting variables 25201 1726882707.76740: in VariableManager get_vars() 25201 1726882707.76784: Calling all_inventory to load vars for managed_node2 25201 1726882707.76787: Calling groups_inventory to load vars for managed_node2 25201 1726882707.76790: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882707.76803: Calling all_plugins_play to load vars for managed_node2 25201 1726882707.76806: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882707.76809: Calling groups_plugins_play to load vars for managed_node2 25201 1726882707.78341: done sending task result for task 0e448fcc-3ce9-313b-197e-0000000005cd 25201 1726882707.78344: WORKER PROCESS EXITING 25201 1726882707.78822: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882707.80744: done with get_vars() 25201 1726882707.80783: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:13 Friday 20 September 2024 21:38:27 -0400 (0:00:00.062) 0:00:28.983 ****** 25201 1726882707.80900: entering _queue_task() for managed_node2/include_tasks 25201 1726882707.81293: worker is 1 (out of 1 available) 25201 1726882707.81306: exiting _queue_task() for managed_node2/include_tasks 25201 1726882707.81318: done queuing things up, now waiting for results queue to drain 25201 1726882707.81320: waiting for pending results... 25201 1726882707.81684: running TaskExecutor() for managed_node2/TASK: Include the task 'show_interfaces.yml' 25201 1726882707.81819: in run() - task 0e448fcc-3ce9-313b-197e-0000000005ce 25201 1726882707.81853: variable 'ansible_search_path' from source: unknown 25201 1726882707.81864: variable 'ansible_search_path' from source: unknown 25201 1726882707.81921: calling self._execute() 25201 1726882707.82018: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882707.82030: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882707.82054: variable 'omit' from source: magic vars 25201 1726882707.82558: variable 'ansible_distribution_major_version' from source: facts 25201 1726882707.82585: Evaluated conditional (ansible_distribution_major_version != '6'): True 25201 1726882707.82606: _execute() done 25201 1726882707.82631: dumping result to json 25201 1726882707.82648: done dumping result, returning 25201 1726882707.82659: done running TaskExecutor() for managed_node2/TASK: Include the task 'show_interfaces.yml' [0e448fcc-3ce9-313b-197e-0000000005ce] 25201 1726882707.82695: sending task result for task 0e448fcc-3ce9-313b-197e-0000000005ce 25201 1726882707.82897: no more pending results, returning what we have 25201 1726882707.82904: in VariableManager get_vars() 25201 1726882707.82997: Calling all_inventory to load vars for managed_node2 25201 1726882707.83001: Calling groups_inventory to load vars for managed_node2 25201 1726882707.83004: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882707.83018: Calling all_plugins_play to load vars for managed_node2 25201 1726882707.83022: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882707.83025: Calling groups_plugins_play to load vars for managed_node2 25201 1726882707.84302: done sending task result for task 0e448fcc-3ce9-313b-197e-0000000005ce 25201 
1726882707.84305: WORKER PROCESS EXITING 25201 1726882707.84943: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882707.87056: done with get_vars() 25201 1726882707.87080: variable 'ansible_search_path' from source: unknown 25201 1726882707.87082: variable 'ansible_search_path' from source: unknown 25201 1726882707.87121: we have included files to process 25201 1726882707.87122: generating all_blocks data 25201 1726882707.87125: done generating all_blocks data 25201 1726882707.87137: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 25201 1726882707.87138: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 25201 1726882707.87141: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 25201 1726882707.87260: in VariableManager get_vars() 25201 1726882707.87294: done with get_vars() 25201 1726882707.87425: done processing included file 25201 1726882707.87427: iterating over new_blocks loaded from include file 25201 1726882707.87435: in VariableManager get_vars() 25201 1726882707.87455: done with get_vars() 25201 1726882707.87457: filtering new block on tags 25201 1726882707.87486: done filtering new block on tags 25201 1726882707.87488: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed_node2 25201 1726882707.87499: extending task lists for all hosts with included blocks 25201 1726882707.88001: done extending task lists 25201 1726882707.88003: done processing included files 25201 1726882707.88004: results queue empty 25201 1726882707.88004: checking for any_errors_fatal 25201 1726882707.88007: done checking for any_errors_fatal 25201 1726882707.88008: checking for max_fail_percentage 25201 1726882707.88009: done checking for max_fail_percentage 25201 1726882707.88010: checking to see if all hosts have failed and the running result is not ok 25201 1726882707.88010: done checking to see if all hosts have failed 25201 1726882707.88011: getting the remaining hosts for this loop 25201 1726882707.88012: done getting the remaining hosts for this loop 25201 1726882707.88015: getting the next task for host managed_node2 25201 1726882707.88019: done getting next task for host managed_node2 25201 1726882707.88021: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 25201 1726882707.88024: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=5, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? 
False, did start at task? False 25201 1726882707.88028: getting variables 25201 1726882707.88028: in VariableManager get_vars() 25201 1726882707.88042: Calling all_inventory to load vars for managed_node2 25201 1726882707.88045: Calling groups_inventory to load vars for managed_node2 25201 1726882707.88047: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882707.88053: Calling all_plugins_play to load vars for managed_node2 25201 1726882707.88055: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882707.88058: Calling groups_plugins_play to load vars for managed_node2 25201 1726882707.89322: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882707.91136: done with get_vars() 25201 1726882707.91157: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Friday 20 September 2024 21:38:27 -0400 (0:00:00.103) 0:00:29.087 ****** 25201 1726882707.91249: entering _queue_task() for managed_node2/include_tasks 25201 1726882707.91678: worker is 1 (out of 1 available) 25201 1726882707.91704: exiting _queue_task() for managed_node2/include_tasks 25201 1726882707.91726: done queuing things up, now waiting for results queue to drain 25201 1726882707.91727: waiting for pending results... 25201 1726882707.92106: running TaskExecutor() for managed_node2/TASK: Include the task 'get_current_interfaces.yml' 25201 1726882707.92244: in run() - task 0e448fcc-3ce9-313b-197e-0000000006e4 25201 1726882707.92262: variable 'ansible_search_path' from source: unknown 25201 1726882707.92273: variable 'ansible_search_path' from source: unknown 25201 1726882707.92324: calling self._execute() 25201 1726882707.92422: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882707.92434: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882707.92449: variable 'omit' from source: magic vars 25201 1726882707.92923: variable 'ansible_distribution_major_version' from source: facts 25201 1726882707.92945: Evaluated conditional (ansible_distribution_major_version != '6'): True 25201 1726882707.92956: _execute() done 25201 1726882707.92982: dumping result to json 25201 1726882707.92991: done dumping result, returning 25201 1726882707.93003: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_current_interfaces.yml' [0e448fcc-3ce9-313b-197e-0000000006e4] 25201 1726882707.93013: sending task result for task 0e448fcc-3ce9-313b-197e-0000000006e4 25201 1726882707.93127: done sending task result for task 0e448fcc-3ce9-313b-197e-0000000006e4 25201 1726882707.93134: WORKER PROCESS EXITING 25201 1726882707.93174: no more pending results, returning what we have 25201 1726882707.93179: in VariableManager get_vars() 25201 1726882707.93225: Calling all_inventory to load vars for managed_node2 25201 1726882707.93228: Calling groups_inventory to load vars for managed_node2 25201 1726882707.93231: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882707.93244: Calling all_plugins_play to load vars for managed_node2 25201 1726882707.93248: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882707.93251: Calling groups_plugins_play to load vars for managed_node2 25201 1726882707.94991: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882707.96565: done with get_vars() 25201 1726882707.96582: variable 'ansible_search_path' from source: unknown 25201 1726882707.96583: variable 'ansible_search_path' from source: unknown 25201 1726882707.96634: we have included files to process 25201 1726882707.96635: generating all_blocks data 25201 1726882707.96636: done generating all_blocks data 25201 1726882707.96638: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 25201 1726882707.96639: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 25201 1726882707.96640: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 25201 1726882707.96897: done processing included file 25201 1726882707.96899: iterating over new_blocks loaded from include file 25201 1726882707.96901: in VariableManager get_vars() 25201 1726882707.96922: done with get_vars() 25201 1726882707.96924: filtering new block on tags 25201 1726882707.96942: done filtering new block on tags 25201 1726882707.96944: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed_node2 25201 1726882707.96949: extending task lists for all hosts with included blocks 25201 1726882707.97099: done extending task lists 25201 1726882707.97101: done processing included files 25201 1726882707.97102: results queue empty 25201 1726882707.97103: checking for any_errors_fatal 25201 1726882707.97107: done checking for any_errors_fatal 25201 1726882707.97108: checking for max_fail_percentage 25201 1726882707.97109: done checking for max_fail_percentage 25201 1726882707.97110: checking to see if all hosts have failed and the running result is not ok 25201 1726882707.97110: done checking to see if all hosts have failed 25201 1726882707.97111: getting the remaining hosts for this loop 25201 1726882707.97112: done getting the remaining hosts for this loop 25201 1726882707.97115: getting the next task for host managed_node2 25201 1726882707.97119: done getting next task for host managed_node2 25201 1726882707.97121: ^ task is: TASK: Gather current interface info 25201 1726882707.97124: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=5, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 25201 1726882707.97126: getting variables 25201 1726882707.97127: in VariableManager get_vars() 25201 1726882707.97141: Calling all_inventory to load vars for managed_node2 25201 1726882707.97143: Calling groups_inventory to load vars for managed_node2 25201 1726882707.97145: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882707.97150: Calling all_plugins_play to load vars for managed_node2 25201 1726882707.97153: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882707.97155: Calling groups_plugins_play to load vars for managed_node2 25201 1726882707.98357: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882708.00041: done with get_vars() 25201 1726882708.00066: done getting variables 25201 1726882708.00099: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Friday 20 September 2024 21:38:28 -0400 (0:00:00.088) 0:00:29.175 ****** 25201 1726882708.00121: entering _queue_task() for managed_node2/command 25201 1726882708.00352: worker is 1 (out of 1 available) 25201 1726882708.00367: exiting _queue_task() for managed_node2/command 25201 1726882708.00378: done queuing things up, now waiting for results queue to drain 25201 1726882708.00380: waiting for pending results... 
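The include chain traced above resolves show_interfaces.yml (manage_test_interface.yml:13), which pulls in get_current_interfaces.yml (show_interfaces.yml:3) and then prints the result via the debug task whose output appears further down in this log (show_interfaces.yml:5). A plausible reconstruction of that wrapper file, based only on the task names, paths, and the MSG line visible in the log; the real file may differ in detail:

# show_interfaces.yml (reconstruction from the log, not the collection's actual file)
- name: Include the task 'get_current_interfaces.yml'
  include_tasks: get_current_interfaces.yml

- name: Show current_interfaces
  debug:
    msg: "current_interfaces: {{ current_interfaces }}"   # matches the MSG printed later for this task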
25201 1726882708.00560: running TaskExecutor() for managed_node2/TASK: Gather current interface info 25201 1726882708.00636: in run() - task 0e448fcc-3ce9-313b-197e-00000000071b 25201 1726882708.00647: variable 'ansible_search_path' from source: unknown 25201 1726882708.00651: variable 'ansible_search_path' from source: unknown 25201 1726882708.00685: calling self._execute() 25201 1726882708.00750: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882708.00753: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882708.00762: variable 'omit' from source: magic vars 25201 1726882708.01167: variable 'ansible_distribution_major_version' from source: facts 25201 1726882708.01177: Evaluated conditional (ansible_distribution_major_version != '6'): True 25201 1726882708.01183: variable 'omit' from source: magic vars 25201 1726882708.01218: variable 'omit' from source: magic vars 25201 1726882708.01255: variable 'omit' from source: magic vars 25201 1726882708.01290: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25201 1726882708.01315: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25201 1726882708.01331: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25201 1726882708.01344: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25201 1726882708.01354: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25201 1726882708.01382: variable 'inventory_hostname' from source: host vars for 'managed_node2' 25201 1726882708.01385: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882708.01389: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882708.01498: Set connection var ansible_shell_executable to /bin/sh 25201 1726882708.01501: Set connection var ansible_pipelining to False 25201 1726882708.01504: Set connection var ansible_connection to ssh 25201 1726882708.01506: Set connection var ansible_module_compression to ZIP_DEFLATED 25201 1726882708.01509: Set connection var ansible_shell_type to sh 25201 1726882708.01511: Set connection var ansible_timeout to 10 25201 1726882708.01513: variable 'ansible_shell_executable' from source: unknown 25201 1726882708.01516: variable 'ansible_connection' from source: unknown 25201 1726882708.01519: variable 'ansible_module_compression' from source: unknown 25201 1726882708.01521: variable 'ansible_shell_type' from source: unknown 25201 1726882708.01523: variable 'ansible_shell_executable' from source: unknown 25201 1726882708.01526: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882708.01528: variable 'ansible_pipelining' from source: unknown 25201 1726882708.01530: variable 'ansible_timeout' from source: unknown 25201 1726882708.01532: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882708.01628: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 25201 1726882708.01637: variable 'omit' from source: magic vars 25201 
1726882708.01642: starting attempt loop 25201 1726882708.01645: running the handler 25201 1726882708.01657: _low_level_execute_command(): starting 25201 1726882708.01668: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 25201 1726882708.02241: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882708.02245: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882708.02278: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 25201 1726882708.02282: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882708.02285: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882708.02339: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882708.02342: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882708.02344: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882708.02451: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882708.04121: stdout chunk (state=3): >>>/root <<< 25201 1726882708.04220: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882708.04273: stderr chunk (state=3): >>><<< 25201 1726882708.04276: stdout chunk (state=3): >>><<< 25201 1726882708.04289: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25201 1726882708.04299: _low_level_execute_command(): starting 25201 1726882708.04304: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo 
/root/.ansible/tmp/ansible-tmp-1726882708.042887-26482-153734522433611 `" && echo ansible-tmp-1726882708.042887-26482-153734522433611="` echo /root/.ansible/tmp/ansible-tmp-1726882708.042887-26482-153734522433611 `" ) && sleep 0' 25201 1726882708.04723: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25201 1726882708.04732: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882708.04738: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882708.04750: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882708.04782: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882708.04792: stderr chunk (state=3): >>>debug2: match not found <<< 25201 1726882708.04797: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882708.04808: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25201 1726882708.04816: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 25201 1726882708.04821: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882708.04828: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882708.04838: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882708.04843: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882708.04894: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882708.04911: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882708.05017: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882708.06884: stdout chunk (state=3): >>>ansible-tmp-1726882708.042887-26482-153734522433611=/root/.ansible/tmp/ansible-tmp-1726882708.042887-26482-153734522433611 <<< 25201 1726882708.06992: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882708.07040: stderr chunk (state=3): >>><<< 25201 1726882708.07043: stdout chunk (state=3): >>><<< 25201 1726882708.07055: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882708.042887-26482-153734522433611=/root/.ansible/tmp/ansible-tmp-1726882708.042887-26482-153734522433611 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25201 1726882708.07080: variable 'ansible_module_compression' from source: unknown 25201 1726882708.07119: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-25201fmfeipqk/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 25201 1726882708.07147: variable 'ansible_facts' from source: unknown 25201 1726882708.07203: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882708.042887-26482-153734522433611/AnsiballZ_command.py 25201 1726882708.07297: Sending initial data 25201 1726882708.07300: Sent initial data (155 bytes) 25201 1726882708.07938: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882708.07944: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882708.07990: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882708.07993: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882708.07996: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882708.08047: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882708.08051: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882708.08155: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882708.09877: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 25201 1726882708.09884: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 25201 1726882708.09891: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 <<< 25201 1726882708.09897: stderr chunk (state=3): >>>debug2: Server supports extension "fstatvfs@openssh.com" revision 2 <<< 25201 1726882708.09903: stderr chunk (state=3): >>>debug2: Server supports extension "hardlink@openssh.com" revision 1 <<< 25201 1726882708.09915: stderr chunk (state=3): >>>debug2: Server supports extension "fsync@openssh.com" revision 1 <<< 25201 1726882708.09921: stderr chunk (state=3): >>>debug2: Server supports extension "lsetstat@openssh.com" revision 1 <<< 25201 1726882708.09927: stderr chunk (state=3): >>>debug2: Server supports extension "limits@openssh.com" revision 1 <<< 25201 1726882708.09934: stderr chunk (state=3): >>>debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 25201 
1726882708.10039: stderr chunk (state=3): >>>debug1: Using server download size 261120 <<< 25201 1726882708.10049: stderr chunk (state=3): >>>debug1: Using server upload size 261120 <<< 25201 1726882708.10055: stderr chunk (state=3): >>>debug1: Server handle limit 1019; using 64 <<< 25201 1726882708.10166: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-25201fmfeipqk/tmprn75plt4 /root/.ansible/tmp/ansible-tmp-1726882708.042887-26482-153734522433611/AnsiballZ_command.py <<< 25201 1726882708.10281: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 25201 1726882708.11301: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882708.11391: stderr chunk (state=3): >>><<< 25201 1726882708.11394: stdout chunk (state=3): >>><<< 25201 1726882708.11409: done transferring module to remote 25201 1726882708.11418: _low_level_execute_command(): starting 25201 1726882708.11420: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882708.042887-26482-153734522433611/ /root/.ansible/tmp/ansible-tmp-1726882708.042887-26482-153734522433611/AnsiballZ_command.py && sleep 0' 25201 1726882708.11845: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882708.11851: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882708.11891: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882708.11897: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882708.11899: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<< 25201 1726882708.11901: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882708.11942: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882708.11945: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882708.12056: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882708.13819: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882708.13860: stderr chunk (state=3): >>><<< 25201 1726882708.13869: stdout chunk (state=3): >>><<< 25201 1726882708.13879: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25201 1726882708.13882: _low_level_execute_command(): starting 25201 1726882708.13887: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882708.042887-26482-153734522433611/AnsiballZ_command.py && sleep 0' 25201 1726882708.14287: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882708.14292: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882708.14323: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882708.14340: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882708.14343: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882708.14392: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882708.14404: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882708.14508: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882708.27923: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo\nveth0", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 21:38:28.273857", "end": "2024-09-20 21:38:28.277235", "delta": "0:00:00.003378", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 25201 1726882708.29172: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
<<< 25201 1726882708.29204: stderr chunk (state=3): >>><<< 25201 1726882708.29207: stdout chunk (state=3): >>><<< 25201 1726882708.29340: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo\nveth0", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 21:38:28.273857", "end": "2024-09-20 21:38:28.277235", "delta": "0:00:00.003378", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
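The module invocation above reduces to a single ls of /sys/class/net. A sketch of the task as implied by the module_args in the result (get_current_interfaces.yml:3); the register name and changed_when handling are inferred from the '_current_interfaces' variable and the changed=false task result seen later, so treat them as assumptions:

- name: Gather current interface info
  command: ls -1
  args:
    chdir: /sys/class/net
  register: _current_interfaces   # inferred from the '_current_interfaces' variable referenced below
  changed_when: false             # inferred from the "Evaluated conditional (False)" and changed=false result below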
25201 1726882708.29345: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882708.042887-26482-153734522433611/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 25201 1726882708.29348: _low_level_execute_command(): starting 25201 1726882708.29351: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882708.042887-26482-153734522433611/ > /dev/null 2>&1 && sleep 0' 25201 1726882708.29885: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25201 1726882708.29900: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882708.29915: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882708.29938: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882708.29983: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882708.29997: stderr chunk (state=3): >>>debug2: match not found <<< 25201 1726882708.30012: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882708.30030: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25201 1726882708.30042: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 25201 1726882708.30053: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 25201 1726882708.30068: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882708.30083: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882708.30099: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882708.30113: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882708.30125: stderr chunk (state=3): >>>debug2: match found <<< 25201 1726882708.30139: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882708.30215: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882708.30232: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882708.30246: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882708.30387: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882708.32208: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882708.32261: stderr chunk (state=3): >>><<< 25201 1726882708.32266: stdout chunk (state=3): >>><<< 25201 1726882708.32285: _low_level_execute_command() done: rc=0, stdout=, 
stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25201 1726882708.32291: handler run complete 25201 1726882708.32318: Evaluated conditional (False): False 25201 1726882708.32327: attempt loop complete, returning result 25201 1726882708.32330: _execute() done 25201 1726882708.32332: dumping result to json 25201 1726882708.32337: done dumping result, returning 25201 1726882708.32346: done running TaskExecutor() for managed_node2/TASK: Gather current interface info [0e448fcc-3ce9-313b-197e-00000000071b] 25201 1726882708.32351: sending task result for task 0e448fcc-3ce9-313b-197e-00000000071b 25201 1726882708.32461: done sending task result for task 0e448fcc-3ce9-313b-197e-00000000071b 25201 1726882708.32466: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003378", "end": "2024-09-20 21:38:28.277235", "rc": 0, "start": "2024-09-20 21:38:28.273857" } STDOUT: bonding_masters eth0 lo veth0 25201 1726882708.32536: no more pending results, returning what we have 25201 1726882708.32540: results queue empty 25201 1726882708.32541: checking for any_errors_fatal 25201 1726882708.32542: done checking for any_errors_fatal 25201 1726882708.32543: checking for max_fail_percentage 25201 1726882708.32545: done checking for max_fail_percentage 25201 1726882708.32546: checking to see if all hosts have failed and the running result is not ok 25201 1726882708.32547: done checking to see if all hosts have failed 25201 1726882708.32547: getting the remaining hosts for this loop 25201 1726882708.32549: done getting the remaining hosts for this loop 25201 1726882708.32553: getting the next task for host managed_node2 25201 1726882708.32560: done getting next task for host managed_node2 25201 1726882708.32563: ^ task is: TASK: Set current_interfaces 25201 1726882708.32570: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=5, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 25201 1726882708.32575: getting variables 25201 1726882708.32577: in VariableManager get_vars() 25201 1726882708.32613: Calling all_inventory to load vars for managed_node2 25201 1726882708.32615: Calling groups_inventory to load vars for managed_node2 25201 1726882708.32617: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882708.32627: Calling all_plugins_play to load vars for managed_node2 25201 1726882708.32629: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882708.32632: Calling groups_plugins_play to load vars for managed_node2 25201 1726882708.34071: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882708.35815: done with get_vars() 25201 1726882708.35838: done getting variables 25201 1726882708.35906: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Friday 20 September 2024 21:38:28 -0400 (0:00:00.358) 0:00:29.534 ****** 25201 1726882708.35940: entering _queue_task() for managed_node2/set_fact 25201 1726882708.36232: worker is 1 (out of 1 available) 25201 1726882708.36245: exiting _queue_task() for managed_node2/set_fact 25201 1726882708.36258: done queuing things up, now waiting for results queue to drain 25201 1726882708.36260: waiting for pending results... 
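The set_fact task queued above (get_current_interfaces.yml:9) turns the registered command output into the current_interfaces fact shown in the result below. A likely shape, assuming stdout_lines is the filter used (the exact expression is not visible in the log):

- name: Set current_interfaces
  set_fact:
    current_interfaces: "{{ _current_interfaces.stdout_lines }}"   # assumed; yields ['bonding_masters', 'eth0', 'lo', 'veth0'] in this run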
25201 1726882708.36562: running TaskExecutor() for managed_node2/TASK: Set current_interfaces 25201 1726882708.36711: in run() - task 0e448fcc-3ce9-313b-197e-00000000071c 25201 1726882708.36732: variable 'ansible_search_path' from source: unknown 25201 1726882708.36740: variable 'ansible_search_path' from source: unknown 25201 1726882708.36788: calling self._execute() 25201 1726882708.36895: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882708.36907: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882708.36918: variable 'omit' from source: magic vars 25201 1726882708.37309: variable 'ansible_distribution_major_version' from source: facts 25201 1726882708.37329: Evaluated conditional (ansible_distribution_major_version != '6'): True 25201 1726882708.37340: variable 'omit' from source: magic vars 25201 1726882708.37403: variable 'omit' from source: magic vars 25201 1726882708.37520: variable '_current_interfaces' from source: set_fact 25201 1726882708.37591: variable 'omit' from source: magic vars 25201 1726882708.37652: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25201 1726882708.37692: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25201 1726882708.37708: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25201 1726882708.37721: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25201 1726882708.37731: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25201 1726882708.37770: variable 'inventory_hostname' from source: host vars for 'managed_node2' 25201 1726882708.37774: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882708.37776: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882708.37852: Set connection var ansible_shell_executable to /bin/sh 25201 1726882708.37855: Set connection var ansible_pipelining to False 25201 1726882708.37860: Set connection var ansible_connection to ssh 25201 1726882708.37869: Set connection var ansible_module_compression to ZIP_DEFLATED 25201 1726882708.37873: Set connection var ansible_shell_type to sh 25201 1726882708.37883: Set connection var ansible_timeout to 10 25201 1726882708.37899: variable 'ansible_shell_executable' from source: unknown 25201 1726882708.37904: variable 'ansible_connection' from source: unknown 25201 1726882708.37906: variable 'ansible_module_compression' from source: unknown 25201 1726882708.37908: variable 'ansible_shell_type' from source: unknown 25201 1726882708.37911: variable 'ansible_shell_executable' from source: unknown 25201 1726882708.37913: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882708.37915: variable 'ansible_pipelining' from source: unknown 25201 1726882708.37917: variable 'ansible_timeout' from source: unknown 25201 1726882708.37921: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882708.38026: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, 
class_only=False) 25201 1726882708.38032: variable 'omit' from source: magic vars 25201 1726882708.38037: starting attempt loop 25201 1726882708.38040: running the handler 25201 1726882708.38049: handler run complete 25201 1726882708.38058: attempt loop complete, returning result 25201 1726882708.38061: _execute() done 25201 1726882708.38065: dumping result to json 25201 1726882708.38069: done dumping result, returning 25201 1726882708.38076: done running TaskExecutor() for managed_node2/TASK: Set current_interfaces [0e448fcc-3ce9-313b-197e-00000000071c] 25201 1726882708.38081: sending task result for task 0e448fcc-3ce9-313b-197e-00000000071c 25201 1726882708.38160: done sending task result for task 0e448fcc-3ce9-313b-197e-00000000071c 25201 1726882708.38163: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo", "veth0" ] }, "changed": false } 25201 1726882708.38221: no more pending results, returning what we have 25201 1726882708.38224: results queue empty 25201 1726882708.38225: checking for any_errors_fatal 25201 1726882708.38231: done checking for any_errors_fatal 25201 1726882708.38232: checking for max_fail_percentage 25201 1726882708.38234: done checking for max_fail_percentage 25201 1726882708.38234: checking to see if all hosts have failed and the running result is not ok 25201 1726882708.38235: done checking to see if all hosts have failed 25201 1726882708.38236: getting the remaining hosts for this loop 25201 1726882708.38238: done getting the remaining hosts for this loop 25201 1726882708.38241: getting the next task for host managed_node2 25201 1726882708.38249: done getting next task for host managed_node2 25201 1726882708.38251: ^ task is: TASK: Show current_interfaces 25201 1726882708.38255: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=5, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 25201 1726882708.38258: getting variables 25201 1726882708.38259: in VariableManager get_vars() 25201 1726882708.38295: Calling all_inventory to load vars for managed_node2 25201 1726882708.38297: Calling groups_inventory to load vars for managed_node2 25201 1726882708.38299: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882708.38308: Calling all_plugins_play to load vars for managed_node2 25201 1726882708.38311: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882708.38313: Calling groups_plugins_play to load vars for managed_node2 25201 1726882708.44323: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882708.46813: done with get_vars() 25201 1726882708.46838: done getting variables 25201 1726882708.46891: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Friday 20 September 2024 21:38:28 -0400 (0:00:00.109) 0:00:29.643 ****** 25201 1726882708.46918: entering _queue_task() for managed_node2/debug 25201 1726882708.47394: worker is 1 (out of 1 available) 25201 1726882708.47407: exiting _queue_task() for managed_node2/debug 25201 1726882708.47420: done queuing things up, now waiting for results queue to drain 25201 1726882708.47422: waiting for pending results... 25201 1726882708.47730: running TaskExecutor() for managed_node2/TASK: Show current_interfaces 25201 1726882708.47861: in run() - task 0e448fcc-3ce9-313b-197e-0000000006e5 25201 1726882708.47887: variable 'ansible_search_path' from source: unknown 25201 1726882708.47896: variable 'ansible_search_path' from source: unknown 25201 1726882708.47936: calling self._execute() 25201 1726882708.48038: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882708.48050: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882708.48068: variable 'omit' from source: magic vars 25201 1726882708.48478: variable 'ansible_distribution_major_version' from source: facts 25201 1726882708.48497: Evaluated conditional (ansible_distribution_major_version != '6'): True 25201 1726882708.48511: variable 'omit' from source: magic vars 25201 1726882708.48565: variable 'omit' from source: magic vars 25201 1726882708.48670: variable 'current_interfaces' from source: set_fact 25201 1726882708.48705: variable 'omit' from source: magic vars 25201 1726882708.48751: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25201 1726882708.48793: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25201 1726882708.48817: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25201 1726882708.48840: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25201 1726882708.48863: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25201 1726882708.48901: 
variable 'inventory_hostname' from source: host vars for 'managed_node2' 25201 1726882708.48911: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882708.48920: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882708.49030: Set connection var ansible_shell_executable to /bin/sh 25201 1726882708.49042: Set connection var ansible_pipelining to False 25201 1726882708.49052: Set connection var ansible_connection to ssh 25201 1726882708.49067: Set connection var ansible_module_compression to ZIP_DEFLATED 25201 1726882708.49080: Set connection var ansible_shell_type to sh 25201 1726882708.49094: Set connection var ansible_timeout to 10 25201 1726882708.49133: variable 'ansible_shell_executable' from source: unknown 25201 1726882708.49152: variable 'ansible_connection' from source: unknown 25201 1726882708.49160: variable 'ansible_module_compression' from source: unknown 25201 1726882708.49171: variable 'ansible_shell_type' from source: unknown 25201 1726882708.49179: variable 'ansible_shell_executable' from source: unknown 25201 1726882708.49186: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882708.49194: variable 'ansible_pipelining' from source: unknown 25201 1726882708.49202: variable 'ansible_timeout' from source: unknown 25201 1726882708.49213: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882708.49362: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 25201 1726882708.49381: variable 'omit' from source: magic vars 25201 1726882708.49395: starting attempt loop 25201 1726882708.49403: running the handler 25201 1726882708.49458: handler run complete 25201 1726882708.49481: attempt loop complete, returning result 25201 1726882708.49489: _execute() done 25201 1726882708.49499: dumping result to json 25201 1726882708.49507: done dumping result, returning 25201 1726882708.49522: done running TaskExecutor() for managed_node2/TASK: Show current_interfaces [0e448fcc-3ce9-313b-197e-0000000006e5] 25201 1726882708.49534: sending task result for task 0e448fcc-3ce9-313b-197e-0000000006e5 ok: [managed_node2] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo', 'veth0'] 25201 1726882708.49697: no more pending results, returning what we have 25201 1726882708.49700: results queue empty 25201 1726882708.49701: checking for any_errors_fatal 25201 1726882708.49710: done checking for any_errors_fatal 25201 1726882708.49711: checking for max_fail_percentage 25201 1726882708.49712: done checking for max_fail_percentage 25201 1726882708.49713: checking to see if all hosts have failed and the running result is not ok 25201 1726882708.49714: done checking to see if all hosts have failed 25201 1726882708.49715: getting the remaining hosts for this loop 25201 1726882708.49716: done getting the remaining hosts for this loop 25201 1726882708.49721: getting the next task for host managed_node2 25201 1726882708.49730: done getting next task for host managed_node2 25201 1726882708.49733: ^ task is: TASK: Install iproute 25201 1726882708.49737: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=5, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 25201 1726882708.49740: getting variables 25201 1726882708.49742: in VariableManager get_vars() 25201 1726882708.49787: Calling all_inventory to load vars for managed_node2 25201 1726882708.49790: Calling groups_inventory to load vars for managed_node2 25201 1726882708.49792: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882708.49805: Calling all_plugins_play to load vars for managed_node2 25201 1726882708.49808: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882708.49811: Calling groups_plugins_play to load vars for managed_node2 25201 1726882708.51186: done sending task result for task 0e448fcc-3ce9-313b-197e-0000000006e5 25201 1726882708.51190: WORKER PROCESS EXITING 25201 1726882708.51527: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882708.53662: done with get_vars() 25201 1726882708.53690: done getting variables 25201 1726882708.53749: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Install iproute] ********************************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16 Friday 20 September 2024 21:38:28 -0400 (0:00:00.068) 0:00:29.712 ****** 25201 1726882708.53801: entering _queue_task() for managed_node2/package 25201 1726882708.54582: worker is 1 (out of 1 available) 25201 1726882708.54594: exiting _queue_task() for managed_node2/package 25201 1726882708.54607: done queuing things up, now waiting for results queue to drain 25201 1726882708.54608: waiting for pending results... 
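The "Show current_interfaces" entry above is a plain debug task. A minimal sketch of what the task at tests/network/playbooks/tasks/show_interfaces.yml:5 likely looks like, based on the debug action plugin loaded and the message it prints; the exact msg template is an assumption:

# Approximate reconstruction; the msg wording is inferred from the
# "current_interfaces: ['bonding_masters', 'eth0', 'lo', 'veth0']" output above.
- name: Show current_interfaces
  debug:
    msg: "current_interfaces: {{ current_interfaces }}"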
25201 1726882708.54756: running TaskExecutor() for managed_node2/TASK: Install iproute 25201 1726882708.54845: in run() - task 0e448fcc-3ce9-313b-197e-0000000005cf 25201 1726882708.54859: variable 'ansible_search_path' from source: unknown 25201 1726882708.54863: variable 'ansible_search_path' from source: unknown 25201 1726882708.54905: calling self._execute() 25201 1726882708.55010: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882708.55016: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882708.55026: variable 'omit' from source: magic vars 25201 1726882708.55429: variable 'ansible_distribution_major_version' from source: facts 25201 1726882708.55442: Evaluated conditional (ansible_distribution_major_version != '6'): True 25201 1726882708.55449: variable 'omit' from source: magic vars 25201 1726882708.55495: variable 'omit' from source: magic vars 25201 1726882708.55696: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 25201 1726882708.59027: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 25201 1726882708.59094: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 25201 1726882708.59131: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 25201 1726882708.59182: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 25201 1726882708.59208: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 25201 1726882708.59297: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25201 1726882708.59323: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25201 1726882708.59350: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25201 1726882708.59392: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25201 1726882708.59408: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25201 1726882708.59510: variable '__network_is_ostree' from source: set_fact 25201 1726882708.59513: variable 'omit' from source: magic vars 25201 1726882708.59541: variable 'omit' from source: magic vars 25201 1726882708.59570: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25201 1726882708.59597: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25201 1726882708.59614: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25201 1726882708.59632: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 25201 1726882708.59640: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25201 1726882708.59672: variable 'inventory_hostname' from source: host vars for 'managed_node2' 25201 1726882708.59677: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882708.59680: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882708.59778: Set connection var ansible_shell_executable to /bin/sh 25201 1726882708.59784: Set connection var ansible_pipelining to False 25201 1726882708.59790: Set connection var ansible_connection to ssh 25201 1726882708.59793: Set connection var ansible_module_compression to ZIP_DEFLATED 25201 1726882708.59796: Set connection var ansible_shell_type to sh 25201 1726882708.59805: Set connection var ansible_timeout to 10 25201 1726882708.59828: variable 'ansible_shell_executable' from source: unknown 25201 1726882708.59831: variable 'ansible_connection' from source: unknown 25201 1726882708.59833: variable 'ansible_module_compression' from source: unknown 25201 1726882708.59836: variable 'ansible_shell_type' from source: unknown 25201 1726882708.59838: variable 'ansible_shell_executable' from source: unknown 25201 1726882708.59840: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882708.59845: variable 'ansible_pipelining' from source: unknown 25201 1726882708.59847: variable 'ansible_timeout' from source: unknown 25201 1726882708.59850: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882708.59952: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 25201 1726882708.59962: variable 'omit' from source: magic vars 25201 1726882708.59973: starting attempt loop 25201 1726882708.59976: running the handler 25201 1726882708.59981: variable 'ansible_facts' from source: unknown 25201 1726882708.59984: variable 'ansible_facts' from source: unknown 25201 1726882708.60019: _low_level_execute_command(): starting 25201 1726882708.60022: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 25201 1726882708.60716: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25201 1726882708.60729: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882708.60738: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882708.60752: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882708.60793: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882708.60801: stderr chunk (state=3): >>>debug2: match not found <<< 25201 1726882708.60811: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882708.60824: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25201 1726882708.60831: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 25201 1726882708.60839: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 25201 
1726882708.60844: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882708.60854: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882708.60869: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882708.60878: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882708.60886: stderr chunk (state=3): >>>debug2: match found <<< 25201 1726882708.60894: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882708.60977: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882708.61022: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882708.61025: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882708.61333: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882708.62923: stdout chunk (state=3): >>>/root <<< 25201 1726882708.63071: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882708.63514: stderr chunk (state=3): >>><<< 25201 1726882708.63524: stdout chunk (state=3): >>><<< 25201 1726882708.63558: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25201 1726882708.63575: _low_level_execute_command(): starting 25201 1726882708.63586: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882708.6355157-26512-180874495556734 `" && echo ansible-tmp-1726882708.6355157-26512-180874495556734="` echo /root/.ansible/tmp/ansible-tmp-1726882708.6355157-26512-180874495556734 `" ) && sleep 0' 25201 1726882708.64349: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25201 1726882708.64357: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882708.64389: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882708.64400: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882708.64446: stderr chunk (state=3): >>>debug2: checking match for 'final all' 
host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882708.64460: stderr chunk (state=3): >>>debug2: match not found <<< 25201 1726882708.64478: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882708.64499: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25201 1726882708.64514: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 25201 1726882708.64517: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 25201 1726882708.64532: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882708.64542: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882708.64561: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882708.64582: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882708.64588: stderr chunk (state=3): >>>debug2: match found <<< 25201 1726882708.64616: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882708.64689: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882708.64703: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882708.64714: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882708.64874: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882708.66728: stdout chunk (state=3): >>>ansible-tmp-1726882708.6355157-26512-180874495556734=/root/.ansible/tmp/ansible-tmp-1726882708.6355157-26512-180874495556734 <<< 25201 1726882708.66839: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882708.66911: stderr chunk (state=3): >>><<< 25201 1726882708.66914: stdout chunk (state=3): >>><<< 25201 1726882708.66936: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882708.6355157-26512-180874495556734=/root/.ansible/tmp/ansible-tmp-1726882708.6355157-26512-180874495556734 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25201 1726882708.66969: variable 'ansible_module_compression' from source: unknown 25201 1726882708.67033: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-25201fmfeipqk/ansiballz_cache/ansible.modules.dnf-ZIP_DEFLATED 25201 1726882708.67074: variable 'ansible_facts' from source: unknown 25201 1726882708.67185: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882708.6355157-26512-180874495556734/AnsiballZ_dnf.py 25201 1726882708.67321: Sending initial data 25201 1726882708.67324: Sent initial data (152 bytes) 25201 1726882708.68361: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25201 1726882708.68384: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882708.68417: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882708.68421: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882708.68470: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882708.68504: stderr chunk (state=3): >>>debug2: match not found <<< 25201 1726882708.68507: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882708.68517: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25201 1726882708.68520: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 25201 1726882708.68522: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 25201 1726882708.68539: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882708.68541: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882708.68551: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882708.68558: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882708.68564: stderr chunk (state=3): >>>debug2: match found <<< 25201 1726882708.68578: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882708.68648: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882708.68662: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882708.68686: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882708.69087: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882708.70721: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 25201 1726882708.70821: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 25201 1726882708.70921: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-25201fmfeipqk/tmpgndckuow 
/root/.ansible/tmp/ansible-tmp-1726882708.6355157-26512-180874495556734/AnsiballZ_dnf.py <<< 25201 1726882708.71017: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 25201 1726882708.72816: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882708.72989: stderr chunk (state=3): >>><<< 25201 1726882708.72992: stdout chunk (state=3): >>><<< 25201 1726882708.73013: done transferring module to remote 25201 1726882708.73026: _low_level_execute_command(): starting 25201 1726882708.73039: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882708.6355157-26512-180874495556734/ /root/.ansible/tmp/ansible-tmp-1726882708.6355157-26512-180874495556734/AnsiballZ_dnf.py && sleep 0' 25201 1726882708.73728: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25201 1726882708.73736: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882708.73794: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882708.73798: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882708.73825: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882708.73842: stderr chunk (state=3): >>>debug2: match not found <<< 25201 1726882708.73845: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882708.73852: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25201 1726882708.73885: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 25201 1726882708.73888: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 25201 1726882708.73899: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882708.73902: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882708.73928: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882708.73931: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882708.73934: stderr chunk (state=3): >>>debug2: match found <<< 25201 1726882708.73958: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882708.74025: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882708.74053: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882708.74056: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882708.74203: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882708.76041: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882708.76522: stderr chunk (state=3): >>><<< 25201 1726882708.76591: stdout chunk (state=3): >>><<< 25201 1726882708.76639: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25201 1726882708.76677: _low_level_execute_command(): starting 25201 1726882708.76688: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882708.6355157-26512-180874495556734/AnsiballZ_dnf.py && sleep 0' 25201 1726882708.77653: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25201 1726882708.77669: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882708.77688: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882708.77709: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882708.77752: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882708.77765: stderr chunk (state=3): >>>debug2: match not found <<< 25201 1726882708.77781: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882708.77802: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25201 1726882708.77814: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 25201 1726882708.77829: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 25201 1726882708.77841: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882708.77855: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882708.77873: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882708.77885: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882708.77895: stderr chunk (state=3): >>>debug2: match found <<< 25201 1726882708.77911: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882708.78167: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882708.78211: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882708.78248: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882708.78492: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882709.80485: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, 
"allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 25201 1726882709.86248: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. <<< 25201 1726882709.86302: stderr chunk (state=3): >>><<< 25201 1726882709.86328: stdout chunk (state=3): >>><<< 25201 1726882709.86393: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
25201 1726882709.86523: done with _execute_module (ansible.legacy.dnf, {'name': 'iproute', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882708.6355157-26512-180874495556734/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 25201 1726882709.86528: _low_level_execute_command(): starting 25201 1726882709.86530: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882708.6355157-26512-180874495556734/ > /dev/null 2>&1 && sleep 0' 25201 1726882709.87875: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25201 1726882709.87884: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882709.87895: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882709.87909: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882709.87951: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882709.87958: stderr chunk (state=3): >>>debug2: match not found <<< 25201 1726882709.87972: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882709.87987: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25201 1726882709.87992: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 25201 1726882709.88000: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 25201 1726882709.88011: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882709.88024: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882709.88039: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882709.88048: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882709.88057: stderr chunk (state=3): >>>debug2: match found <<< 25201 1726882709.88070: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882709.88141: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882709.88155: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882709.88162: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882709.88384: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882709.90246: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882709.90250: stdout chunk (state=3): >>><<< 25201 1726882709.90259: stderr chunk (state=3): >>><<< 25201 1726882709.90278: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 
3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25201 1726882709.90285: handler run complete 25201 1726882709.90447: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 25201 1726882709.90654: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 25201 1726882709.90701: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 25201 1726882709.90732: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 25201 1726882709.90760: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 25201 1726882709.90837: variable '__install_status' from source: set_fact 25201 1726882709.90857: Evaluated conditional (__install_status is success): True 25201 1726882709.90874: attempt loop complete, returning result 25201 1726882709.90877: _execute() done 25201 1726882709.90880: dumping result to json 25201 1726882709.90885: done dumping result, returning 25201 1726882709.90892: done running TaskExecutor() for managed_node2/TASK: Install iproute [0e448fcc-3ce9-313b-197e-0000000005cf] 25201 1726882709.90897: sending task result for task 0e448fcc-3ce9-313b-197e-0000000005cf 25201 1726882709.91065: done sending task result for task 0e448fcc-3ce9-313b-197e-0000000005cf 25201 1726882709.91069: WORKER PROCESS EXITING ok: [managed_node2] => { "attempts": 1, "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 25201 1726882709.91248: no more pending results, returning what we have 25201 1726882709.91252: results queue empty 25201 1726882709.91253: checking for any_errors_fatal 25201 1726882709.91258: done checking for any_errors_fatal 25201 1726882709.91259: checking for max_fail_percentage 25201 1726882709.91261: done checking for max_fail_percentage 25201 1726882709.91262: checking to see if all hosts have failed and the running result is not ok 25201 1726882709.91263: done checking to see if all hosts have failed 25201 1726882709.91265: getting the remaining hosts for this loop 25201 1726882709.91267: done getting the remaining hosts for this loop 25201 1726882709.91288: getting the next task for host managed_node2 25201 1726882709.91297: done getting next task for host managed_node2 25201 1726882709.91299: ^ task is: TASK: Create veth interface {{ interface }} 25201 1726882709.91302: ^ state is: HOST STATE: block=3, task=15, 
rescue=0, always=5, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 25201 1726882709.91306: getting variables 25201 1726882709.91308: in VariableManager get_vars() 25201 1726882709.91346: Calling all_inventory to load vars for managed_node2 25201 1726882709.91349: Calling groups_inventory to load vars for managed_node2 25201 1726882709.91352: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882709.91363: Calling all_plugins_play to load vars for managed_node2 25201 1726882709.91368: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882709.91371: Calling groups_plugins_play to load vars for managed_node2 25201 1726882709.93878: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882709.97333: done with get_vars() 25201 1726882709.97356: done getting variables 25201 1726882709.97417: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 25201 1726882709.97537: variable 'interface' from source: play vars TASK [Create veth interface veth0] ********************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27 Friday 20 September 2024 21:38:29 -0400 (0:00:01.437) 0:00:31.156 ****** 25201 1726882709.98174: entering _queue_task() for managed_node2/command 25201 1726882709.98765: worker is 1 (out of 1 available) 25201 1726882709.98780: exiting _queue_task() for managed_node2/command 25201 1726882709.98792: done queuing things up, now waiting for results queue to drain 25201 1726882709.98794: waiting for pending results... 
25201 1726882709.98971: running TaskExecutor() for managed_node2/TASK: Create veth interface veth0 25201 1726882709.99089: in run() - task 0e448fcc-3ce9-313b-197e-0000000005d0 25201 1726882709.99106: variable 'ansible_search_path' from source: unknown 25201 1726882709.99113: variable 'ansible_search_path' from source: unknown 25201 1726882709.99398: variable 'interface' from source: play vars 25201 1726882709.99494: variable 'interface' from source: play vars 25201 1726882709.99574: variable 'interface' from source: play vars 25201 1726882709.99736: Loaded config def from plugin (lookup/items) 25201 1726882709.99749: Loading LookupModule 'items' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/items.py 25201 1726882709.99782: variable 'omit' from source: magic vars 25201 1726882709.99938: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882709.99954: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882709.99977: variable 'omit' from source: magic vars 25201 1726882710.00232: variable 'ansible_distribution_major_version' from source: facts 25201 1726882710.00250: Evaluated conditional (ansible_distribution_major_version != '6'): True 25201 1726882710.00496: variable 'type' from source: play vars 25201 1726882710.00516: variable 'state' from source: include params 25201 1726882710.00749: variable 'interface' from source: play vars 25201 1726882710.00759: variable 'current_interfaces' from source: set_fact 25201 1726882710.00776: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): False 25201 1726882710.00785: when evaluation is False, skipping this task 25201 1726882710.00818: variable 'item' from source: unknown 25201 1726882710.00899: variable 'item' from source: unknown skipping: [managed_node2] => (item=ip link add veth0 type veth peer name peerveth0) => { "ansible_loop_var": "item", "changed": false, "false_condition": "type == 'veth' and state == 'present' and interface not in current_interfaces", "item": "ip link add veth0 type veth peer name peerveth0", "skip_reason": "Conditional result was False" } 25201 1726882710.01147: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882710.01166: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882710.01190: variable 'omit' from source: magic vars 25201 1726882710.01351: variable 'ansible_distribution_major_version' from source: facts 25201 1726882710.01362: Evaluated conditional (ansible_distribution_major_version != '6'): True 25201 1726882710.01561: variable 'type' from source: play vars 25201 1726882710.01576: variable 'state' from source: include params 25201 1726882710.01584: variable 'interface' from source: play vars 25201 1726882710.01591: variable 'current_interfaces' from source: set_fact 25201 1726882710.01600: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): False 25201 1726882710.01606: when evaluation is False, skipping this task 25201 1726882710.01641: variable 'item' from source: unknown 25201 1726882710.01708: variable 'item' from source: unknown skipping: [managed_node2] => (item=ip link set peerveth0 up) => { "ansible_loop_var": "item", "changed": false, "false_condition": "type == 'veth' and state == 'present' and interface not in current_interfaces", "item": "ip link set peerveth0 up", "skip_reason": "Conditional result was False" } 25201 1726882710.01854: variable 
'ansible_host' from source: host vars for 'managed_node2' 25201 1726882710.01875: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882710.01890: variable 'omit' from source: magic vars 25201 1726882710.02071: variable 'ansible_distribution_major_version' from source: facts 25201 1726882710.02082: Evaluated conditional (ansible_distribution_major_version != '6'): True 25201 1726882710.02338: variable 'type' from source: play vars 25201 1726882710.02348: variable 'state' from source: include params 25201 1726882710.02356: variable 'interface' from source: play vars 25201 1726882710.02369: variable 'current_interfaces' from source: set_fact 25201 1726882710.02380: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): False 25201 1726882710.02387: when evaluation is False, skipping this task 25201 1726882710.02450: variable 'item' from source: unknown 25201 1726882710.02543: variable 'item' from source: unknown skipping: [managed_node2] => (item=ip link set veth0 up) => { "ansible_loop_var": "item", "changed": false, "false_condition": "type == 'veth' and state == 'present' and interface not in current_interfaces", "item": "ip link set veth0 up", "skip_reason": "Conditional result was False" } 25201 1726882710.02631: dumping result to json 25201 1726882710.02649: done dumping result, returning 25201 1726882710.02659: done running TaskExecutor() for managed_node2/TASK: Create veth interface veth0 [0e448fcc-3ce9-313b-197e-0000000005d0] 25201 1726882710.02675: sending task result for task 0e448fcc-3ce9-313b-197e-0000000005d0 skipping: [managed_node2] => { "changed": false } MSG: All items skipped 25201 1726882710.02776: no more pending results, returning what we have 25201 1726882710.02780: results queue empty 25201 1726882710.02781: checking for any_errors_fatal 25201 1726882710.02791: done checking for any_errors_fatal 25201 1726882710.02791: checking for max_fail_percentage 25201 1726882710.02793: done checking for max_fail_percentage 25201 1726882710.02794: checking to see if all hosts have failed and the running result is not ok 25201 1726882710.02795: done checking to see if all hosts have failed 25201 1726882710.02795: getting the remaining hosts for this loop 25201 1726882710.02797: done getting the remaining hosts for this loop 25201 1726882710.02800: getting the next task for host managed_node2 25201 1726882710.02806: done getting next task for host managed_node2 25201 1726882710.02808: ^ task is: TASK: Set up veth as managed by NetworkManager 25201 1726882710.02811: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=5, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 25201 1726882710.02814: getting variables 25201 1726882710.02816: in VariableManager get_vars() 25201 1726882710.02852: Calling all_inventory to load vars for managed_node2 25201 1726882710.02855: Calling groups_inventory to load vars for managed_node2 25201 1726882710.02857: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882710.02870: Calling all_plugins_play to load vars for managed_node2 25201 1726882710.02873: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882710.02876: Calling groups_plugins_play to load vars for managed_node2 25201 1726882710.03394: done sending task result for task 0e448fcc-3ce9-313b-197e-0000000005d0 25201 1726882710.03398: WORKER PROCESS EXITING 25201 1726882710.04439: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882710.07038: done with get_vars() 25201 1726882710.07284: done getting variables 25201 1726882710.07341: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set up veth as managed by NetworkManager] ******************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:35 Friday 20 September 2024 21:38:30 -0400 (0:00:00.092) 0:00:31.248 ****** 25201 1726882710.07496: entering _queue_task() for managed_node2/command 25201 1726882710.08267: worker is 1 (out of 1 available) 25201 1726882710.08283: exiting _queue_task() for managed_node2/command 25201 1726882710.08299: done queuing things up, now waiting for results queue to drain 25201 1726882710.08301: waiting for pending results... 
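All three loop items of the "Create veth interface veth0" task were skipped because the shared conditional evaluated to False. From the rendered items and the false_condition string in the skip records above, the task at manage_test_interface.yml:27 plausibly looks like the sketch below; writing the items with {{ interface }} rather than the literal veth0 is an assumption, since only the rendered commands appear in the log:

# Approximate reconstruction of the skipped task; with_items matches the use of
# the 'items' lookup plugin noted in the log, and the when clause matches the
# false_condition shown in each skip record.
- name: Create veth interface {{ interface }}
  command: "{{ item }}"
  with_items:
    - ip link add {{ interface }} type veth peer name peer{{ interface }}
    - ip link set peer{{ interface }} up
    - ip link set {{ interface }} up
  when: type == 'veth' and state == 'present' and interface not in current_interfaces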
25201 1726882710.08632: running TaskExecutor() for managed_node2/TASK: Set up veth as managed by NetworkManager 25201 1726882710.08745: in run() - task 0e448fcc-3ce9-313b-197e-0000000005d1 25201 1726882710.08770: variable 'ansible_search_path' from source: unknown 25201 1726882710.08779: variable 'ansible_search_path' from source: unknown 25201 1726882710.08827: calling self._execute() 25201 1726882710.08937: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882710.08948: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882710.08968: variable 'omit' from source: magic vars 25201 1726882710.09384: variable 'ansible_distribution_major_version' from source: facts 25201 1726882710.09405: Evaluated conditional (ansible_distribution_major_version != '6'): True 25201 1726882710.09590: variable 'type' from source: play vars 25201 1726882710.09600: variable 'state' from source: include params 25201 1726882710.09608: Evaluated conditional (type == 'veth' and state == 'present'): False 25201 1726882710.09615: when evaluation is False, skipping this task 25201 1726882710.09621: _execute() done 25201 1726882710.09632: dumping result to json 25201 1726882710.09638: done dumping result, returning 25201 1726882710.09647: done running TaskExecutor() for managed_node2/TASK: Set up veth as managed by NetworkManager [0e448fcc-3ce9-313b-197e-0000000005d1] 25201 1726882710.09655: sending task result for task 0e448fcc-3ce9-313b-197e-0000000005d1 skipping: [managed_node2] => { "changed": false, "false_condition": "type == 'veth' and state == 'present'", "skip_reason": "Conditional result was False" } 25201 1726882710.09808: no more pending results, returning what we have 25201 1726882710.09812: results queue empty 25201 1726882710.09813: checking for any_errors_fatal 25201 1726882710.09824: done checking for any_errors_fatal 25201 1726882710.09825: checking for max_fail_percentage 25201 1726882710.09827: done checking for max_fail_percentage 25201 1726882710.09828: checking to see if all hosts have failed and the running result is not ok 25201 1726882710.09828: done checking to see if all hosts have failed 25201 1726882710.09829: getting the remaining hosts for this loop 25201 1726882710.09831: done getting the remaining hosts for this loop 25201 1726882710.09835: getting the next task for host managed_node2 25201 1726882710.09842: done getting next task for host managed_node2 25201 1726882710.09845: ^ task is: TASK: Delete veth interface {{ interface }} 25201 1726882710.09849: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=5, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 25201 1726882710.09853: getting variables 25201 1726882710.09855: in VariableManager get_vars() 25201 1726882710.09900: Calling all_inventory to load vars for managed_node2 25201 1726882710.09904: Calling groups_inventory to load vars for managed_node2 25201 1726882710.09906: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882710.09919: Calling all_plugins_play to load vars for managed_node2 25201 1726882710.09922: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882710.09925: Calling groups_plugins_play to load vars for managed_node2 25201 1726882710.10920: done sending task result for task 0e448fcc-3ce9-313b-197e-0000000005d1 25201 1726882710.10923: WORKER PROCESS EXITING 25201 1726882710.11751: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882710.13909: done with get_vars() 25201 1726882710.13937: done getting variables 25201 1726882710.14001: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 25201 1726882710.14127: variable 'interface' from source: play vars TASK [Delete veth interface veth0] ********************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:43 Friday 20 September 2024 21:38:30 -0400 (0:00:00.067) 0:00:31.316 ****** 25201 1726882710.14166: entering _queue_task() for managed_node2/command 25201 1726882710.14433: worker is 1 (out of 1 available) 25201 1726882710.14445: exiting _queue_task() for managed_node2/command 25201 1726882710.14455: done queuing things up, now waiting for results queue to drain 25201 1726882710.14572: waiting for pending results... 
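Both 'present' branches of manage_test_interface.yml are skipped in this stretch, while the "Delete veth interface veth0" task queued just above goes on to run (its conditional evaluates True just below). That pins down the variable sources the log keeps naming: interface and type come from play vars, state is passed in as an include parameter, and the conditional evaluations imply interface=veth0, type=veth, state=absent. A hypothetical, simplified sketch of that wiring; the play layout and the include task's name are illustrative only, not the actual test playbook:

# Illustrative only: shows the variable sources ("play vars" vs "include params")
# and the values implied by the conditional evaluations recorded in this log.
- hosts: managed_node2
  vars:
    interface: veth0
    type: veth
  tasks:
    - name: Manage the test interface   # hypothetical task name
      include_tasks: tasks/manage_test_interface.yml
      vars:
        state: absent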
25201 1726882710.15329: running TaskExecutor() for managed_node2/TASK: Delete veth interface veth0 25201 1726882710.15421: in run() - task 0e448fcc-3ce9-313b-197e-0000000005d2 25201 1726882710.15428: variable 'ansible_search_path' from source: unknown 25201 1726882710.15433: variable 'ansible_search_path' from source: unknown 25201 1726882710.15471: calling self._execute() 25201 1726882710.15569: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882710.15573: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882710.15584: variable 'omit' from source: magic vars 25201 1726882710.15992: variable 'ansible_distribution_major_version' from source: facts 25201 1726882710.16004: Evaluated conditional (ansible_distribution_major_version != '6'): True 25201 1726882710.16201: variable 'type' from source: play vars 25201 1726882710.16204: variable 'state' from source: include params 25201 1726882710.16210: variable 'interface' from source: play vars 25201 1726882710.16214: variable 'current_interfaces' from source: set_fact 25201 1726882710.16223: Evaluated conditional (type == 'veth' and state == 'absent' and interface in current_interfaces): True 25201 1726882710.16226: variable 'omit' from source: magic vars 25201 1726882710.16267: variable 'omit' from source: magic vars 25201 1726882710.16356: variable 'interface' from source: play vars 25201 1726882710.16376: variable 'omit' from source: magic vars 25201 1726882710.16414: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25201 1726882710.16447: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25201 1726882710.16469: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25201 1726882710.16485: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25201 1726882710.16496: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25201 1726882710.16524: variable 'inventory_hostname' from source: host vars for 'managed_node2' 25201 1726882710.16528: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882710.16530: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882710.16633: Set connection var ansible_shell_executable to /bin/sh 25201 1726882710.16637: Set connection var ansible_pipelining to False 25201 1726882710.16643: Set connection var ansible_connection to ssh 25201 1726882710.16648: Set connection var ansible_module_compression to ZIP_DEFLATED 25201 1726882710.16652: Set connection var ansible_shell_type to sh 25201 1726882710.16660: Set connection var ansible_timeout to 10 25201 1726882710.16684: variable 'ansible_shell_executable' from source: unknown 25201 1726882710.16687: variable 'ansible_connection' from source: unknown 25201 1726882710.16690: variable 'ansible_module_compression' from source: unknown 25201 1726882710.16692: variable 'ansible_shell_type' from source: unknown 25201 1726882710.16695: variable 'ansible_shell_executable' from source: unknown 25201 1726882710.16697: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882710.16699: variable 'ansible_pipelining' from source: unknown 25201 1726882710.16703: variable 'ansible_timeout' from source: unknown 25201 1726882710.16707: 
variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882710.16839: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 25201 1726882710.16848: variable 'omit' from source: magic vars 25201 1726882710.16851: starting attempt loop 25201 1726882710.16853: running the handler 25201 1726882710.16873: _low_level_execute_command(): starting 25201 1726882710.16885: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 25201 1726882710.17797: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25201 1726882710.17808: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882710.17819: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882710.17833: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882710.17872: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882710.17879: stderr chunk (state=3): >>>debug2: match not found <<< 25201 1726882710.17889: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882710.17902: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25201 1726882710.17911: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 25201 1726882710.17917: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 25201 1726882710.17925: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882710.17934: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882710.17945: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882710.17952: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882710.17959: stderr chunk (state=3): >>>debug2: match found <<< 25201 1726882710.17973: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882710.18044: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882710.18067: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882710.18079: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882710.18209: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882710.19869: stdout chunk (state=3): >>>/root <<< 25201 1726882710.20048: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882710.20053: stdout chunk (state=3): >>><<< 25201 1726882710.20062: stderr chunk (state=3): >>><<< 25201 1726882710.20087: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 
originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25201 1726882710.20097: _low_level_execute_command(): starting 25201 1726882710.20103: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882710.200845-26572-29009040504408 `" && echo ansible-tmp-1726882710.200845-26572-29009040504408="` echo /root/.ansible/tmp/ansible-tmp-1726882710.200845-26572-29009040504408 `" ) && sleep 0' 25201 1726882710.21639: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25201 1726882710.21643: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882710.21645: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882710.21648: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882710.22347: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882710.22354: stderr chunk (state=3): >>>debug2: match not found <<< 25201 1726882710.22367: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882710.22378: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25201 1726882710.22385: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 25201 1726882710.22391: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 25201 1726882710.22398: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882710.22406: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882710.22417: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882710.22423: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882710.22429: stderr chunk (state=3): >>>debug2: match found <<< 25201 1726882710.22439: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882710.22516: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882710.22584: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882710.22594: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882710.22716: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882710.24589: stdout chunk (state=3): 
>>>ansible-tmp-1726882710.200845-26572-29009040504408=/root/.ansible/tmp/ansible-tmp-1726882710.200845-26572-29009040504408 <<< 25201 1726882710.24766: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882710.24770: stdout chunk (state=3): >>><<< 25201 1726882710.24775: stderr chunk (state=3): >>><<< 25201 1726882710.24791: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882710.200845-26572-29009040504408=/root/.ansible/tmp/ansible-tmp-1726882710.200845-26572-29009040504408 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25201 1726882710.24821: variable 'ansible_module_compression' from source: unknown 25201 1726882710.24873: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-25201fmfeipqk/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 25201 1726882710.24904: variable 'ansible_facts' from source: unknown 25201 1726882710.24989: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882710.200845-26572-29009040504408/AnsiballZ_command.py 25201 1726882710.25119: Sending initial data 25201 1726882710.25123: Sent initial data (154 bytes) 25201 1726882710.26080: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25201 1726882710.26090: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882710.26096: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882710.26109: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882710.26144: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882710.26150: stderr chunk (state=3): >>>debug2: match not found <<< 25201 1726882710.26160: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882710.26175: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25201 1726882710.26182: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 25201 1726882710.26190: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 25201 1726882710.26200: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882710.26205: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 
1726882710.26216: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882710.26223: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882710.26229: stderr chunk (state=3): >>>debug2: match found <<< 25201 1726882710.26238: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882710.26313: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882710.26325: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882710.26336: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882710.27085: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882710.28827: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 25201 1726882710.28922: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 25201 1726882710.29024: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-25201fmfeipqk/tmpduri1lv6 /root/.ansible/tmp/ansible-tmp-1726882710.200845-26572-29009040504408/AnsiballZ_command.py <<< 25201 1726882710.29120: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 25201 1726882710.30643: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882710.30725: stderr chunk (state=3): >>><<< 25201 1726882710.30728: stdout chunk (state=3): >>><<< 25201 1726882710.30749: done transferring module to remote 25201 1726882710.30760: _low_level_execute_command(): starting 25201 1726882710.30768: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882710.200845-26572-29009040504408/ /root/.ansible/tmp/ansible-tmp-1726882710.200845-26572-29009040504408/AnsiballZ_command.py && sleep 0' 25201 1726882710.32481: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25201 1726882710.32491: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882710.32502: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882710.32516: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882710.32554: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882710.32561: stderr chunk (state=3): >>>debug2: match not found <<< 25201 1726882710.32572: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882710.32586: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25201 1726882710.32593: stderr chunk (state=3): >>>debug2: 
resolve_canonicalize: hostname 10.31.11.158 is address <<< 25201 1726882710.32599: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 25201 1726882710.32606: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882710.32615: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882710.32626: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882710.32635: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882710.32639: stderr chunk (state=3): >>>debug2: match found <<< 25201 1726882710.32648: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882710.32721: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882710.32735: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882710.32748: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882710.33491: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882710.35330: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882710.35333: stdout chunk (state=3): >>><<< 25201 1726882710.35341: stderr chunk (state=3): >>><<< 25201 1726882710.35357: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25201 1726882710.35360: _low_level_execute_command(): starting 25201 1726882710.35370: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882710.200845-26572-29009040504408/AnsiballZ_command.py && sleep 0' 25201 1726882710.37058: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25201 1726882710.37069: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882710.37081: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882710.37095: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882710.37132: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882710.37140: stderr chunk (state=3): 
>>>debug2: match not found <<< 25201 1726882710.37149: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882710.37161: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25201 1726882710.37173: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 25201 1726882710.37180: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 25201 1726882710.37187: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882710.37195: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882710.37206: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882710.37212: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882710.37218: stderr chunk (state=3): >>>debug2: match found <<< 25201 1726882710.37227: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882710.37295: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882710.37308: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882710.37318: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882710.38100: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882710.52685: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "del", "veth0", "type", "veth"], "start": "2024-09-20 21:38:30.508988", "end": "2024-09-20 21:38:30.522269", "delta": "0:00:00.013281", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link del veth0 type veth", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 25201 1726882710.53970: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
<<< 25201 1726882710.53975: stdout chunk (state=3): >>><<< 25201 1726882710.53979: stderr chunk (state=3): >>><<< 25201 1726882710.53997: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "del", "veth0", "type", "veth"], "start": "2024-09-20 21:38:30.508988", "end": "2024-09-20 21:38:30.522269", "delta": "0:00:00.013281", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link del veth0 type veth", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
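The stdout captured above is the single JSON document the AnsiballZ wrapper prints when the module finishes; the controller parses it, and everything shown later (rc, cmd, delta) comes from that object. The sketch below reuses the exact values from this run (the `module_args` block is trimmed for brevity) and checks that `delta` is simply `end - start`.

    import json
    from datetime import datetime

    # The module's stdout, pasted from this run (module_args trimmed).
    payload = '''{"changed": true, "stdout": "", "stderr": "", "rc": 0,
     "cmd": ["ip", "link", "del", "veth0", "type", "veth"],
     "start": "2024-09-20 21:38:30.508988", "end": "2024-09-20 21:38:30.522269",
     "delta": "0:00:00.013281", "msg": ""}'''

    result = json.loads(payload)
    fmt = "%Y-%m-%d %H:%M:%S.%f"
    elapsed = datetime.strptime(result["end"], fmt) - datetime.strptime(result["start"], fmt)
    print(result["cmd"], result["rc"], elapsed)   # rc=0, elapsed 0:00:00.013281
    assert str(elapsed) == result["delta"]        # delta is just end - start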
25201 1726882710.54040: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link del veth0 type veth', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882710.200845-26572-29009040504408/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 25201 1726882710.54047: _low_level_execute_command(): starting 25201 1726882710.54100: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882710.200845-26572-29009040504408/ > /dev/null 2>&1 && sleep 0' 25201 1726882710.54784: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25201 1726882710.54787: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882710.54790: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882710.54792: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882710.54794: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882710.54801: stderr chunk (state=3): >>>debug2: match not found <<< 25201 1726882710.54803: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882710.54805: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25201 1726882710.54807: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 25201 1726882710.55104: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 25201 1726882710.55107: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882710.55109: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882710.55111: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882710.55113: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882710.55115: stderr chunk (state=3): >>>debug2: match found <<< 25201 1726882710.55117: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882710.55119: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882710.55121: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882710.55122: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882710.55474: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882710.57183: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882710.57186: stdout chunk (state=3): >>><<< 25201 1726882710.57193: stderr chunk (state=3): >>><<< 25201 1726882710.57221: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, 
OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25201 1726882710.57225: handler run complete 25201 1726882710.57261: Evaluated conditional (False): False 25201 1726882710.57322: attempt loop complete, returning result 25201 1726882710.57327: _execute() done 25201 1726882710.57330: dumping result to json 25201 1726882710.57332: done dumping result, returning 25201 1726882710.57334: done running TaskExecutor() for managed_node2/TASK: Delete veth interface veth0 [0e448fcc-3ce9-313b-197e-0000000005d2] 25201 1726882710.57336: sending task result for task 0e448fcc-3ce9-313b-197e-0000000005d2 25201 1726882710.57403: done sending task result for task 0e448fcc-3ce9-313b-197e-0000000005d2 25201 1726882710.57405: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": [ "ip", "link", "del", "veth0", "type", "veth" ], "delta": "0:00:00.013281", "end": "2024-09-20 21:38:30.522269", "rc": 0, "start": "2024-09-20 21:38:30.508988" } 25201 1726882710.57475: no more pending results, returning what we have 25201 1726882710.57478: results queue empty 25201 1726882710.57479: checking for any_errors_fatal 25201 1726882710.57487: done checking for any_errors_fatal 25201 1726882710.57487: checking for max_fail_percentage 25201 1726882710.57489: done checking for max_fail_percentage 25201 1726882710.57490: checking to see if all hosts have failed and the running result is not ok 25201 1726882710.57491: done checking to see if all hosts have failed 25201 1726882710.57492: getting the remaining hosts for this loop 25201 1726882710.57493: done getting the remaining hosts for this loop 25201 1726882710.57497: getting the next task for host managed_node2 25201 1726882710.57504: done getting next task for host managed_node2 25201 1726882710.57506: ^ task is: TASK: Create dummy interface {{ interface }} 25201 1726882710.57509: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=5, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 25201 1726882710.57515: getting variables 25201 1726882710.57517: in VariableManager get_vars() 25201 1726882710.57556: Calling all_inventory to load vars for managed_node2 25201 1726882710.57558: Calling groups_inventory to load vars for managed_node2 25201 1726882710.57560: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882710.57574: Calling all_plugins_play to load vars for managed_node2 25201 1726882710.57577: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882710.57579: Calling groups_plugins_play to load vars for managed_node2 25201 1726882710.60559: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882710.62696: done with get_vars() 25201 1726882710.62720: done getting variables 25201 1726882710.62795: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 25201 1726882710.62929: variable 'interface' from source: play vars TASK [Create dummy interface veth0] ******************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:49 Friday 20 September 2024 21:38:30 -0400 (0:00:00.487) 0:00:31.804 ****** 25201 1726882710.62971: entering _queue_task() for managed_node2/command 25201 1726882710.63330: worker is 1 (out of 1 available) 25201 1726882710.63344: exiting _queue_task() for managed_node2/command 25201 1726882710.63355: done queuing things up, now waiting for results queue to drain 25201 1726882710.63356: waiting for pending results... 
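The next four tasks in this log (create/delete for dummy and tap) are all skipped because only one branch of the type/state matrix in manage_test_interface.yml can match a given run. The sketch below re-evaluates that matrix with this run's values (type=veth, state=absent, interface=veth0); the `current_interfaces` list is a hypothetical stand-in for the earlier set_fact result, not a value taken from this log.

    # This run's values, visible in the trace above; current_interfaces is a
    # hypothetical stand-in for the earlier set_fact result.
    type_, state, interface = "veth", "absent", "veth0"
    current_interfaces = ["lo", "eth0", "veth0", "peerveth0"]

    # One (type, state) branch per task in manage_test_interface.yml:
    # 'present' branches require the interface to be missing, 'absent'
    # branches require it to exist.
    branches = [("veth", "present"), ("veth", "absent"),
                ("dummy", "present"), ("dummy", "absent"),
                ("tap", "present"), ("tap", "absent")]

    for btype, bstate in branches:
        membership_ok = (interface in current_interfaces if bstate == "absent"
                         else interface not in current_interfaces)
        runs = (type_ == btype) and (state == bstate) and membership_ok
        print(f"{btype:5} {bstate:7} -> {'run' if runs else 'skip'}")
    # Only 'veth absent' prints 'run'; every other row corresponds to one of
    # the skipping: results in this part of the log.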
25201 1726882710.63678: running TaskExecutor() for managed_node2/TASK: Create dummy interface veth0 25201 1726882710.63805: in run() - task 0e448fcc-3ce9-313b-197e-0000000005d3 25201 1726882710.63899: variable 'ansible_search_path' from source: unknown 25201 1726882710.63910: variable 'ansible_search_path' from source: unknown 25201 1726882710.63975: calling self._execute() 25201 1726882710.64087: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882710.64104: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882710.64127: variable 'omit' from source: magic vars 25201 1726882710.64559: variable 'ansible_distribution_major_version' from source: facts 25201 1726882710.64582: Evaluated conditional (ansible_distribution_major_version != '6'): True 25201 1726882710.64894: variable 'type' from source: play vars 25201 1726882710.64903: variable 'state' from source: include params 25201 1726882710.64910: variable 'interface' from source: play vars 25201 1726882710.64922: variable 'current_interfaces' from source: set_fact 25201 1726882710.64933: Evaluated conditional (type == 'dummy' and state == 'present' and interface not in current_interfaces): False 25201 1726882710.64940: when evaluation is False, skipping this task 25201 1726882710.64946: _execute() done 25201 1726882710.64951: dumping result to json 25201 1726882710.64968: done dumping result, returning 25201 1726882710.64978: done running TaskExecutor() for managed_node2/TASK: Create dummy interface veth0 [0e448fcc-3ce9-313b-197e-0000000005d3] 25201 1726882710.64994: sending task result for task 0e448fcc-3ce9-313b-197e-0000000005d3 skipping: [managed_node2] => { "changed": false, "false_condition": "type == 'dummy' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was False" } 25201 1726882710.65148: no more pending results, returning what we have 25201 1726882710.65151: results queue empty 25201 1726882710.65153: checking for any_errors_fatal 25201 1726882710.65166: done checking for any_errors_fatal 25201 1726882710.65167: checking for max_fail_percentage 25201 1726882710.65168: done checking for max_fail_percentage 25201 1726882710.65170: checking to see if all hosts have failed and the running result is not ok 25201 1726882710.65171: done checking to see if all hosts have failed 25201 1726882710.65171: getting the remaining hosts for this loop 25201 1726882710.65173: done getting the remaining hosts for this loop 25201 1726882710.65177: getting the next task for host managed_node2 25201 1726882710.65185: done getting next task for host managed_node2 25201 1726882710.65188: ^ task is: TASK: Delete dummy interface {{ interface }} 25201 1726882710.65191: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=5, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 25201 1726882710.65196: getting variables 25201 1726882710.65198: in VariableManager get_vars() 25201 1726882710.65242: Calling all_inventory to load vars for managed_node2 25201 1726882710.65244: Calling groups_inventory to load vars for managed_node2 25201 1726882710.65247: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882710.65260: Calling all_plugins_play to load vars for managed_node2 25201 1726882710.65267: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882710.65270: Calling groups_plugins_play to load vars for managed_node2 25201 1726882710.66323: done sending task result for task 0e448fcc-3ce9-313b-197e-0000000005d3 25201 1726882710.66326: WORKER PROCESS EXITING 25201 1726882710.67957: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882710.72120: done with get_vars() 25201 1726882710.72147: done getting variables 25201 1726882710.72223: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 25201 1726882710.72345: variable 'interface' from source: play vars TASK [Delete dummy interface veth0] ******************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:54 Friday 20 September 2024 21:38:30 -0400 (0:00:00.094) 0:00:31.898 ****** 25201 1726882710.72384: entering _queue_task() for managed_node2/command 25201 1726882710.72721: worker is 1 (out of 1 available) 25201 1726882710.72733: exiting _queue_task() for managed_node2/command 25201 1726882710.72751: done queuing things up, now waiting for results queue to drain 25201 1726882710.72753: waiting for pending results... 
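Skipped tasks still produce a result dictionary, and the `skipping:` blocks above show its visible fields (`changed`, `false_condition`, `skip_reason`). The helper below is hypothetical, not an Ansible API; it only illustrates how a callback or test could classify such a result from those fields.

    # Field layout copied from the skipping: blocks printed above.
    result = {
        "changed": False,
        "false_condition": "type == 'dummy' and state == 'absent' "
                           "and interface in current_interfaces",
        "skip_reason": "Conditional result was False",
    }

    def describe(res):
        """Hypothetical helper: classify a task result by its visible fields."""
        if "skip_reason" in res:
            return f"skipped, condition was false: {res['false_condition']}"
        return "changed" if res.get("changed") else "ok"

    print(describe(result))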
25201 1726882710.73049: running TaskExecutor() for managed_node2/TASK: Delete dummy interface veth0 25201 1726882710.73169: in run() - task 0e448fcc-3ce9-313b-197e-0000000005d4 25201 1726882710.73197: variable 'ansible_search_path' from source: unknown 25201 1726882710.73205: variable 'ansible_search_path' from source: unknown 25201 1726882710.73249: calling self._execute() 25201 1726882710.73358: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882710.73375: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882710.73393: variable 'omit' from source: magic vars 25201 1726882710.74546: variable 'ansible_distribution_major_version' from source: facts 25201 1726882710.74715: Evaluated conditional (ansible_distribution_major_version != '6'): True 25201 1726882710.75512: variable 'type' from source: play vars 25201 1726882710.75528: variable 'state' from source: include params 25201 1726882710.75537: variable 'interface' from source: play vars 25201 1726882710.75545: variable 'current_interfaces' from source: set_fact 25201 1726882710.75555: Evaluated conditional (type == 'dummy' and state == 'absent' and interface in current_interfaces): False 25201 1726882710.75561: when evaluation is False, skipping this task 25201 1726882710.75573: _execute() done 25201 1726882710.75580: dumping result to json 25201 1726882710.75591: done dumping result, returning 25201 1726882710.75612: done running TaskExecutor() for managed_node2/TASK: Delete dummy interface veth0 [0e448fcc-3ce9-313b-197e-0000000005d4] 25201 1726882710.75724: sending task result for task 0e448fcc-3ce9-313b-197e-0000000005d4 skipping: [managed_node2] => { "changed": false, "false_condition": "type == 'dummy' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 25201 1726882710.75950: no more pending results, returning what we have 25201 1726882710.75954: results queue empty 25201 1726882710.75955: checking for any_errors_fatal 25201 1726882710.75960: done checking for any_errors_fatal 25201 1726882710.75961: checking for max_fail_percentage 25201 1726882710.75966: done checking for max_fail_percentage 25201 1726882710.75967: checking to see if all hosts have failed and the running result is not ok 25201 1726882710.75968: done checking to see if all hosts have failed 25201 1726882710.75969: getting the remaining hosts for this loop 25201 1726882710.75971: done getting the remaining hosts for this loop 25201 1726882710.75975: getting the next task for host managed_node2 25201 1726882710.75982: done getting next task for host managed_node2 25201 1726882710.75987: ^ task is: TASK: Create tap interface {{ interface }} 25201 1726882710.75990: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=5, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 25201 1726882710.75995: getting variables 25201 1726882710.75997: in VariableManager get_vars() 25201 1726882710.76040: Calling all_inventory to load vars for managed_node2 25201 1726882710.76043: Calling groups_inventory to load vars for managed_node2 25201 1726882710.76046: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882710.76059: Calling all_plugins_play to load vars for managed_node2 25201 1726882710.76066: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882710.76069: Calling groups_plugins_play to load vars for managed_node2 25201 1726882710.77326: done sending task result for task 0e448fcc-3ce9-313b-197e-0000000005d4 25201 1726882710.77329: WORKER PROCESS EXITING 25201 1726882710.78327: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882710.84601: done with get_vars() 25201 1726882710.84647: done getting variables 25201 1726882710.84817: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 25201 1726882710.85169: variable 'interface' from source: play vars TASK [Create tap interface veth0] ********************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:60 Friday 20 September 2024 21:38:30 -0400 (0:00:00.128) 0:00:32.026 ****** 25201 1726882710.85223: entering _queue_task() for managed_node2/command 25201 1726882710.86048: worker is 1 (out of 1 available) 25201 1726882710.86062: exiting _queue_task() for managed_node2/command 25201 1726882710.86077: done queuing things up, now waiting for results queue to drain 25201 1726882710.86079: waiting for pending results... 
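Task headers such as `TASK [Create tap interface veth0]` are templated names (`Create tap interface {{ interface }}` in the task file) rendered with the play vars; the `variable 'interface' from source: play vars` line right before each header is that rendering step. A minimal sketch using Jinja2 directly (jinja 3.1.4 per the header of this run), assuming only `interface` matters here:

    from jinja2 import Template

    # The header "TASK [Create tap interface veth0]" is the task's templated
    # name rendered with the play var 'interface'.
    name = Template("Create tap interface {{ interface }}").render(interface="veth0")
    print(name)   # Create tap interface veth0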
25201 1726882710.87320: running TaskExecutor() for managed_node2/TASK: Create tap interface veth0 25201 1726882710.87671: in run() - task 0e448fcc-3ce9-313b-197e-0000000005d5 25201 1726882710.87697: variable 'ansible_search_path' from source: unknown 25201 1726882710.87706: variable 'ansible_search_path' from source: unknown 25201 1726882710.87751: calling self._execute() 25201 1726882710.88069: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882710.88119: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882710.88201: variable 'omit' from source: magic vars 25201 1726882710.89615: variable 'ansible_distribution_major_version' from source: facts 25201 1726882710.89712: Evaluated conditional (ansible_distribution_major_version != '6'): True 25201 1726882710.90274: variable 'type' from source: play vars 25201 1726882710.90387: variable 'state' from source: include params 25201 1726882710.90471: variable 'interface' from source: play vars 25201 1726882710.90516: variable 'current_interfaces' from source: set_fact 25201 1726882710.90529: Evaluated conditional (type == 'tap' and state == 'present' and interface not in current_interfaces): False 25201 1726882710.90577: when evaluation is False, skipping this task 25201 1726882710.90585: _execute() done 25201 1726882710.90700: dumping result to json 25201 1726882710.90733: done dumping result, returning 25201 1726882710.90812: done running TaskExecutor() for managed_node2/TASK: Create tap interface veth0 [0e448fcc-3ce9-313b-197e-0000000005d5] 25201 1726882710.90857: sending task result for task 0e448fcc-3ce9-313b-197e-0000000005d5 skipping: [managed_node2] => { "changed": false, "false_condition": "type == 'tap' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was False" } 25201 1726882710.91174: no more pending results, returning what we have 25201 1726882710.91178: results queue empty 25201 1726882710.91179: checking for any_errors_fatal 25201 1726882710.91184: done checking for any_errors_fatal 25201 1726882710.91185: checking for max_fail_percentage 25201 1726882710.91187: done checking for max_fail_percentage 25201 1726882710.91188: checking to see if all hosts have failed and the running result is not ok 25201 1726882710.91188: done checking to see if all hosts have failed 25201 1726882710.91189: getting the remaining hosts for this loop 25201 1726882710.91191: done getting the remaining hosts for this loop 25201 1726882710.91195: getting the next task for host managed_node2 25201 1726882710.91202: done getting next task for host managed_node2 25201 1726882710.91204: ^ task is: TASK: Delete tap interface {{ interface }} 25201 1726882710.91208: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=5, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 25201 1726882710.91212: getting variables 25201 1726882710.91214: in VariableManager get_vars() 25201 1726882710.91258: Calling all_inventory to load vars for managed_node2 25201 1726882710.91273: Calling groups_inventory to load vars for managed_node2 25201 1726882710.91277: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882710.91293: Calling all_plugins_play to load vars for managed_node2 25201 1726882710.91297: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882710.91302: Calling groups_plugins_play to load vars for managed_node2 25201 1726882710.91845: done sending task result for task 0e448fcc-3ce9-313b-197e-0000000005d5 25201 1726882710.91848: WORKER PROCESS EXITING 25201 1726882710.94314: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882710.98152: done with get_vars() 25201 1726882710.98179: done getting variables 25201 1726882710.98267: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 25201 1726882710.98380: variable 'interface' from source: play vars TASK [Delete tap interface veth0] ********************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:65 Friday 20 September 2024 21:38:30 -0400 (0:00:00.131) 0:00:32.158 ****** 25201 1726882710.98424: entering _queue_task() for managed_node2/command 25201 1726882710.98949: worker is 1 (out of 1 available) 25201 1726882710.98980: exiting _queue_task() for managed_node2/command 25201 1726882710.98993: done queuing things up, now waiting for results queue to drain 25201 1726882710.98995: waiting for pending results... 
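Each task header also carries a timing banner of the form `(elapsed) cumulative`, where the elapsed figure belongs to the task that just finished. The sketch below reproduces the cumulative column for the last few tasks in this log from their elapsed values; it is an illustration of a running sum, not the timer callback's own code, and agreement is up to the millisecond rounding of the displayed numbers.

    from datetime import timedelta

    # Elapsed values as printed in the banners of this log; the key is the
    # task the elapsed time belongs to (the one that had just finished).
    elapsed = {
        "Delete veth interface veth0":  timedelta(seconds=0.487),
        "Create dummy interface veth0": timedelta(seconds=0.094),
        "Delete dummy interface veth0": timedelta(seconds=0.128),
        "Create tap interface veth0":   timedelta(seconds=0.131),
    }

    total = timedelta(seconds=31.316)   # cumulative value printed before these tasks
    for task, d in elapsed.items():
        total += d
        print(f"{task}: ({d}) {total}")
    # Final line: ... (0:00:00.131000) 0:00:32.156000, i.e. the 0:00:32.158
    # banner above up to the rounding of the displayed milliseconds.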
25201 1726882710.99802: running TaskExecutor() for managed_node2/TASK: Delete tap interface veth0 25201 1726882710.99899: in run() - task 0e448fcc-3ce9-313b-197e-0000000005d6 25201 1726882710.99912: variable 'ansible_search_path' from source: unknown 25201 1726882710.99915: variable 'ansible_search_path' from source: unknown 25201 1726882710.99978: calling self._execute() 25201 1726882711.00284: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882711.00291: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882711.00303: variable 'omit' from source: magic vars 25201 1726882711.00704: variable 'ansible_distribution_major_version' from source: facts 25201 1726882711.00719: Evaluated conditional (ansible_distribution_major_version != '6'): True 25201 1726882711.00939: variable 'type' from source: play vars 25201 1726882711.00943: variable 'state' from source: include params 25201 1726882711.00948: variable 'interface' from source: play vars 25201 1726882711.00952: variable 'current_interfaces' from source: set_fact 25201 1726882711.00961: Evaluated conditional (type == 'tap' and state == 'absent' and interface in current_interfaces): False 25201 1726882711.00967: when evaluation is False, skipping this task 25201 1726882711.00970: _execute() done 25201 1726882711.00973: dumping result to json 25201 1726882711.00976: done dumping result, returning 25201 1726882711.00978: done running TaskExecutor() for managed_node2/TASK: Delete tap interface veth0 [0e448fcc-3ce9-313b-197e-0000000005d6] 25201 1726882711.00986: sending task result for task 0e448fcc-3ce9-313b-197e-0000000005d6 skipping: [managed_node2] => { "changed": false, "false_condition": "type == 'tap' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 25201 1726882711.01135: no more pending results, returning what we have 25201 1726882711.01139: results queue empty 25201 1726882711.01143: checking for any_errors_fatal 25201 1726882711.01150: done checking for any_errors_fatal 25201 1726882711.01151: checking for max_fail_percentage 25201 1726882711.01152: done checking for max_fail_percentage 25201 1726882711.01153: checking to see if all hosts have failed and the running result is not ok 25201 1726882711.01155: done checking to see if all hosts have failed 25201 1726882711.01155: getting the remaining hosts for this loop 25201 1726882711.01157: done getting the remaining hosts for this loop 25201 1726882711.01161: getting the next task for host managed_node2 25201 1726882711.01173: done getting next task for host managed_node2 25201 1726882711.01176: ^ task is: TASK: Clean up namespace 25201 1726882711.01179: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=6, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25201 1726882711.01184: getting variables 25201 1726882711.01186: in VariableManager get_vars() 25201 1726882711.01230: Calling all_inventory to load vars for managed_node2 25201 1726882711.01233: Calling groups_inventory to load vars for managed_node2 25201 1726882711.01236: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882711.01251: Calling all_plugins_play to load vars for managed_node2 25201 1726882711.01255: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882711.01259: Calling groups_plugins_play to load vars for managed_node2 25201 1726882711.01793: done sending task result for task 0e448fcc-3ce9-313b-197e-0000000005d6 25201 1726882711.01796: WORKER PROCESS EXITING 25201 1726882711.02960: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882711.05232: done with get_vars() 25201 1726882711.05254: done getting variables 25201 1726882711.05424: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Clean up namespace] ****************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:108 Friday 20 September 2024 21:38:31 -0400 (0:00:00.070) 0:00:32.229 ****** 25201 1726882711.05451: entering _queue_task() for managed_node2/command 25201 1726882711.06121: worker is 1 (out of 1 available) 25201 1726882711.06139: exiting _queue_task() for managed_node2/command 25201 1726882711.06151: done queuing things up, now waiting for results queue to drain 25201 1726882711.06152: waiting for pending results... 
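The repeated `Calling all_inventory / groups_inventory / ... to load vars` passes are VariableManager.get_vars() walking its sources in a fixed order, with later, more specific sources overriding earlier ones. The sketch below is a deliberately simplified illustration of that layering, not Ansible's real merge code; the source names mirror the log, the variable values are hypothetical.

    # Simplified sketch of variable layering; not Ansible's real merge code.
    # Source names mirror the "Calling ..." lines; the values are hypothetical.
    layers = [
        ("all_inventory",            {"interface": "eth1"}),
        ("groups_inventory",         {}),
        ("all_plugins_inventory",    {}),
        ("all_plugins_play",         {"interface": "veth0", "state": "absent"}),
        ("groups_plugins_inventory", {}),
        ("groups_plugins_play",      {}),
    ]

    merged = {}
    for _source, data in layers:
        merged.update(data)          # later, more specific layers win

    print(merged)                    # {'interface': 'veth0', 'state': 'absent'}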
25201 1726882711.07081: running TaskExecutor() for managed_node2/TASK: Clean up namespace 25201 1726882711.07273: in run() - task 0e448fcc-3ce9-313b-197e-0000000000b4 25201 1726882711.07286: variable 'ansible_search_path' from source: unknown 25201 1726882711.07321: calling self._execute() 25201 1726882711.07534: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882711.07538: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882711.07688: variable 'omit' from source: magic vars 25201 1726882711.08524: variable 'ansible_distribution_major_version' from source: facts 25201 1726882711.08587: Evaluated conditional (ansible_distribution_major_version != '6'): True 25201 1726882711.08592: variable 'omit' from source: magic vars 25201 1726882711.08613: variable 'omit' from source: magic vars 25201 1726882711.08766: variable 'omit' from source: magic vars 25201 1726882711.08809: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25201 1726882711.08843: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25201 1726882711.08939: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25201 1726882711.08958: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25201 1726882711.08976: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25201 1726882711.09010: variable 'inventory_hostname' from source: host vars for 'managed_node2' 25201 1726882711.09013: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882711.09016: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882711.09132: Set connection var ansible_shell_executable to /bin/sh 25201 1726882711.09136: Set connection var ansible_pipelining to False 25201 1726882711.09142: Set connection var ansible_connection to ssh 25201 1726882711.09147: Set connection var ansible_module_compression to ZIP_DEFLATED 25201 1726882711.09149: Set connection var ansible_shell_type to sh 25201 1726882711.09202: Set connection var ansible_timeout to 10 25201 1726882711.09208: variable 'ansible_shell_executable' from source: unknown 25201 1726882711.09211: variable 'ansible_connection' from source: unknown 25201 1726882711.09213: variable 'ansible_module_compression' from source: unknown 25201 1726882711.09216: variable 'ansible_shell_type' from source: unknown 25201 1726882711.09218: variable 'ansible_shell_executable' from source: unknown 25201 1726882711.09220: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882711.09222: variable 'ansible_pipelining' from source: unknown 25201 1726882711.09224: variable 'ansible_timeout' from source: unknown 25201 1726882711.09225: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882711.09350: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 25201 1726882711.09361: variable 'omit' from source: magic vars 25201 1726882711.09368: starting attempt loop 25201 1726882711.09370: running the handler 
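The handler that starts next first bootstraps a working area on the target: `echo ~ && sleep 0` resolves the remote home directory, then a `umask 77 && mkdir -p ...` one-liner creates a private, uniquely named temp directory for AnsiballZ_command.py. The sketch below rebuilds command strings of that shape; `tmp_dir_command` is a hypothetical helper, and its random components only approximate how the real names are generated.

    import random
    import subprocess
    import time

    def tmp_dir_command(remote_tmp="~/.ansible/tmp"):
        """Hypothetical helper: build a mkdir one-liner shaped like the ones in this log."""
        name = f"ansible-tmp-{time.time()}-{random.randint(0, 2**15)}-{random.getrandbits(48)}"
        return (f'( umask 77 && mkdir -p "` echo {remote_tmp} `" '
                f'&& mkdir "` echo {remote_tmp}/{name} `" '
                f'&& echo {name}="` echo {remote_tmp}/{name} `" ) && sleep 0')

    # Step 1: expand ~ on the target (run locally here for illustration).
    home = subprocess.run(["/bin/sh", "-c", "echo ~ && sleep 0"],
                          capture_output=True, text=True).stdout.strip()
    print(home)

    # Step 2: the temp-directory command string the connection would send next.
    print(tmp_dir_command())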
25201 1726882711.09386: _low_level_execute_command(): starting 25201 1726882711.09393: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 25201 1726882711.10287: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25201 1726882711.10308: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882711.10318: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882711.10387: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882711.10432: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882711.10439: stderr chunk (state=3): >>>debug2: match not found <<< 25201 1726882711.10448: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882711.10466: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25201 1726882711.10510: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 25201 1726882711.10519: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 25201 1726882711.10529: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882711.10543: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882711.10555: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882711.10567: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882711.10573: stderr chunk (state=3): >>>debug2: match found <<< 25201 1726882711.10584: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882711.10703: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882711.10772: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882711.10776: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882711.10980: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882711.12616: stdout chunk (state=3): >>>/root <<< 25201 1726882711.12779: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882711.12785: stdout chunk (state=3): >>><<< 25201 1726882711.12793: stderr chunk (state=3): >>><<< 25201 1726882711.12820: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25201 1726882711.12830: _low_level_execute_command(): starting 25201 1726882711.12836: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882711.1281736-26611-109651500392092 `" && echo ansible-tmp-1726882711.1281736-26611-109651500392092="` echo /root/.ansible/tmp/ansible-tmp-1726882711.1281736-26611-109651500392092 `" ) && sleep 0' 25201 1726882711.13996: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882711.14003: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882711.14409: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882711.14415: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882711.14428: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882711.14441: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882711.14521: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882711.14524: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882711.14537: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882711.14660: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882711.16523: stdout chunk (state=3): >>>ansible-tmp-1726882711.1281736-26611-109651500392092=/root/.ansible/tmp/ansible-tmp-1726882711.1281736-26611-109651500392092 <<< 25201 1726882711.16687: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882711.16690: stderr chunk (state=3): >>><<< 25201 1726882711.16697: stdout chunk (state=3): >>><<< 25201 1726882711.16715: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882711.1281736-26611-109651500392092=/root/.ansible/tmp/ansible-tmp-1726882711.1281736-26611-109651500392092 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25201 1726882711.16748: variable 'ansible_module_compression' from source: unknown 25201 1726882711.16803: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-25201fmfeipqk/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 25201 1726882711.16835: variable 'ansible_facts' from source: unknown 25201 1726882711.16908: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882711.1281736-26611-109651500392092/AnsiballZ_command.py 25201 1726882711.17519: Sending initial data 25201 1726882711.17522: Sent initial data (156 bytes) 25201 1726882711.20540: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882711.20545: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882711.20592: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882711.20599: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882711.20614: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<< 25201 1726882711.20619: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882711.20807: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882711.20823: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882711.20826: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882711.20949: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882711.22721: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 25201 1726882711.22821: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle 
limit 1019; using 64 <<< 25201 1726882711.22916: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-25201fmfeipqk/tmpl20qajrn /root/.ansible/tmp/ansible-tmp-1726882711.1281736-26611-109651500392092/AnsiballZ_command.py <<< 25201 1726882711.23011: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 25201 1726882711.24580: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882711.24671: stderr chunk (state=3): >>><<< 25201 1726882711.24675: stdout chunk (state=3): >>><<< 25201 1726882711.24784: done transferring module to remote 25201 1726882711.24788: _low_level_execute_command(): starting 25201 1726882711.24790: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882711.1281736-26611-109651500392092/ /root/.ansible/tmp/ansible-tmp-1726882711.1281736-26611-109651500392092/AnsiballZ_command.py && sleep 0' 25201 1726882711.27646: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882711.27650: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882711.27689: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882711.27692: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882711.27699: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882711.27990: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882711.28076: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882711.28279: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882711.30023: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882711.30090: stderr chunk (state=3): >>><<< 25201 1726882711.30094: stdout chunk (state=3): >>><<< 25201 1726882711.30198: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25201 1726882711.30202: _low_level_execute_command(): starting 25201 1726882711.30204: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882711.1281736-26611-109651500392092/AnsiballZ_command.py && sleep 0' 25201 1726882711.31778: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25201 1726882711.31792: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882711.31805: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882711.31823: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882711.31879: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882711.31892: stderr chunk (state=3): >>>debug2: match not found <<< 25201 1726882711.31906: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882711.31923: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25201 1726882711.31936: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 25201 1726882711.31954: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 25201 1726882711.31969: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882711.31985: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882711.32000: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882711.32013: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882711.32025: stderr chunk (state=3): >>>debug2: match found <<< 25201 1726882711.32043: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882711.32121: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882711.32143: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882711.32173: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882711.32313: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882711.45983: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "netns", "delete", "ns1"], "start": "2024-09-20 21:38:31.452726", "end": "2024-09-20 21:38:31.457852", "delta": "0:00:00.005126", "msg": "", "invocation": {"module_args": {"_raw_params": "ip netns delete ns1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 25201 1726882711.47286: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
<<< 25201 1726882711.47291: stdout chunk (state=3): >>><<< 25201 1726882711.47294: stderr chunk (state=3): >>><<< 25201 1726882711.47371: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "netns", "delete", "ns1"], "start": "2024-09-20 21:38:31.452726", "end": "2024-09-20 21:38:31.457852", "delta": "0:00:00.005126", "msg": "", "invocation": {"module_args": {"_raw_params": "ip netns delete ns1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
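The module result above shows ansible.legacy.command invoked with _raw_params 'ip netns delete ns1' and _uses_shell false. The raw module JSON reports changed: true, while the task result printed further below reports changed: false, which is consistent with a changed_when: false on the task (see the 'Evaluated conditional (False): False' line after 'handler run complete'). A plausible reconstruction of the task follows; the keywords are assumptions rather than quotes from the playbook, and the distribution guard may instead live on an enclosing block:

    # Hypothetical reconstruction of TASK: Clean up namespace
    - name: Clean up namespace
      command: ip netns delete ns1
      changed_when: false
      when: ansible_distribution_major_version != '6'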
25201 1726882711.47375: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip netns delete ns1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882711.1281736-26611-109651500392092/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 25201 1726882711.47378: _low_level_execute_command(): starting 25201 1726882711.47449: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882711.1281736-26611-109651500392092/ > /dev/null 2>&1 && sleep 0' 25201 1726882711.48732: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25201 1726882711.48750: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882711.48771: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882711.48790: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882711.48833: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882711.48845: stderr chunk (state=3): >>>debug2: match not found <<< 25201 1726882711.48865: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882711.48884: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25201 1726882711.48902: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 25201 1726882711.48914: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 25201 1726882711.48927: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882711.48945: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882711.48961: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882711.48980: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882711.48992: stderr chunk (state=3): >>>debug2: match found <<< 25201 1726882711.49006: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882711.49087: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882711.49112: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882711.49131: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882711.49259: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882711.51171: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882711.51174: stdout chunk (state=3): >>><<< 25201 1726882711.51177: stderr chunk (state=3): >>><<< 25201 1726882711.51179: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, 
OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25201 1726882711.51181: handler run complete 25201 1726882711.51271: Evaluated conditional (False): False 25201 1726882711.51275: attempt loop complete, returning result 25201 1726882711.51277: _execute() done 25201 1726882711.51279: dumping result to json 25201 1726882711.51281: done dumping result, returning 25201 1726882711.51283: done running TaskExecutor() for managed_node2/TASK: Clean up namespace [0e448fcc-3ce9-313b-197e-0000000000b4] 25201 1726882711.51285: sending task result for task 0e448fcc-3ce9-313b-197e-0000000000b4 25201 1726882711.51443: done sending task result for task 0e448fcc-3ce9-313b-197e-0000000000b4 25201 1726882711.51447: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": [ "ip", "netns", "delete", "ns1" ], "delta": "0:00:00.005126", "end": "2024-09-20 21:38:31.457852", "rc": 0, "start": "2024-09-20 21:38:31.452726" } 25201 1726882711.51526: no more pending results, returning what we have 25201 1726882711.51529: results queue empty 25201 1726882711.51530: checking for any_errors_fatal 25201 1726882711.51537: done checking for any_errors_fatal 25201 1726882711.51537: checking for max_fail_percentage 25201 1726882711.51539: done checking for max_fail_percentage 25201 1726882711.51540: checking to see if all hosts have failed and the running result is not ok 25201 1726882711.51541: done checking to see if all hosts have failed 25201 1726882711.51542: getting the remaining hosts for this loop 25201 1726882711.51543: done getting the remaining hosts for this loop 25201 1726882711.51547: getting the next task for host managed_node2 25201 1726882711.51553: done getting next task for host managed_node2 25201 1726882711.51555: ^ task is: TASK: Verify network state restored to default 25201 1726882711.51557: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=7, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25201 1726882711.51561: getting variables 25201 1726882711.51565: in VariableManager get_vars() 25201 1726882711.51602: Calling all_inventory to load vars for managed_node2 25201 1726882711.51605: Calling groups_inventory to load vars for managed_node2 25201 1726882711.51607: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882711.51616: Calling all_plugins_play to load vars for managed_node2 25201 1726882711.51619: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882711.51621: Calling groups_plugins_play to load vars for managed_node2 25201 1726882711.53947: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882711.57225: done with get_vars() 25201 1726882711.57243: done getting variables TASK [Verify network state restored to default] ******************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:113 Friday 20 September 2024 21:38:31 -0400 (0:00:00.518) 0:00:32.747 ****** 25201 1726882711.57316: entering _queue_task() for managed_node2/include_tasks 25201 1726882711.57548: worker is 1 (out of 1 available) 25201 1726882711.57560: exiting _queue_task() for managed_node2/include_tasks 25201 1726882711.57574: done queuing things up, now waiting for results queue to drain 25201 1726882711.57576: waiting for pending results... 25201 1726882711.57860: running TaskExecutor() for managed_node2/TASK: Verify network state restored to default 25201 1726882711.57988: in run() - task 0e448fcc-3ce9-313b-197e-0000000000b5 25201 1726882711.58024: variable 'ansible_search_path' from source: unknown 25201 1726882711.58062: calling self._execute() 25201 1726882711.58152: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882711.58158: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882711.58196: variable 'omit' from source: magic vars 25201 1726882711.58547: variable 'ansible_distribution_major_version' from source: facts 25201 1726882711.58558: Evaluated conditional (ansible_distribution_major_version != '6'): True 25201 1726882711.58565: _execute() done 25201 1726882711.58570: dumping result to json 25201 1726882711.58573: done dumping result, returning 25201 1726882711.58583: done running TaskExecutor() for managed_node2/TASK: Verify network state restored to default [0e448fcc-3ce9-313b-197e-0000000000b5] 25201 1726882711.58586: sending task result for task 0e448fcc-3ce9-313b-197e-0000000000b5 25201 1726882711.58677: done sending task result for task 0e448fcc-3ce9-313b-197e-0000000000b5 25201 1726882711.58680: WORKER PROCESS EXITING 25201 1726882711.58732: no more pending results, returning what we have 25201 1726882711.58774: in VariableManager get_vars() 25201 1726882711.58950: Calling all_inventory to load vars for managed_node2 25201 1726882711.58952: Calling groups_inventory to load vars for managed_node2 25201 1726882711.58954: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882711.58970: Calling all_plugins_play to load vars for managed_node2 25201 1726882711.58974: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882711.58978: Calling groups_plugins_play to load vars for managed_node2 25201 1726882711.60558: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882711.62147: done with get_vars() 25201 1726882711.62160: 
variable 'ansible_search_path' from source: unknown 25201 1726882711.62178: we have included files to process 25201 1726882711.62179: generating all_blocks data 25201 1726882711.62181: done generating all_blocks data 25201 1726882711.62186: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 25201 1726882711.62190: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 25201 1726882711.62194: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 25201 1726882711.62471: done processing included file 25201 1726882711.62473: iterating over new_blocks loaded from include file 25201 1726882711.62474: in VariableManager get_vars() 25201 1726882711.62486: done with get_vars() 25201 1726882711.62487: filtering new block on tags 25201 1726882711.62499: done filtering new block on tags 25201 1726882711.62500: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml for managed_node2 25201 1726882711.62503: extending task lists for all hosts with included blocks 25201 1726882711.64195: done extending task lists 25201 1726882711.64196: done processing included files 25201 1726882711.64197: results queue empty 25201 1726882711.64198: checking for any_errors_fatal 25201 1726882711.64202: done checking for any_errors_fatal 25201 1726882711.64203: checking for max_fail_percentage 25201 1726882711.64204: done checking for max_fail_percentage 25201 1726882711.64205: checking to see if all hosts have failed and the running result is not ok 25201 1726882711.64205: done checking to see if all hosts have failed 25201 1726882711.64206: getting the remaining hosts for this loop 25201 1726882711.64207: done getting the remaining hosts for this loop 25201 1726882711.64210: getting the next task for host managed_node2 25201 1726882711.64213: done getting next task for host managed_node2 25201 1726882711.64215: ^ task is: TASK: Check routes and DNS 25201 1726882711.64217: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=8, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 25201 1726882711.64219: getting variables 25201 1726882711.64220: in VariableManager get_vars() 25201 1726882711.64234: Calling all_inventory to load vars for managed_node2 25201 1726882711.64236: Calling groups_inventory to load vars for managed_node2 25201 1726882711.64238: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882711.64243: Calling all_plugins_play to load vars for managed_node2 25201 1726882711.64246: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882711.64249: Calling groups_plugins_play to load vars for managed_node2 25201 1726882711.65545: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882711.67091: done with get_vars() 25201 1726882711.67104: done getting variables 25201 1726882711.67133: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Check routes and DNS] **************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:6 Friday 20 September 2024 21:38:31 -0400 (0:00:00.098) 0:00:32.846 ****** 25201 1726882711.67153: entering _queue_task() for managed_node2/shell 25201 1726882711.67430: worker is 1 (out of 1 available) 25201 1726882711.67441: exiting _queue_task() for managed_node2/shell 25201 1726882711.67453: done queuing things up, now waiting for results queue to drain 25201 1726882711.67455: waiting for pending results... 25201 1726882711.67741: running TaskExecutor() for managed_node2/TASK: Check routes and DNS 25201 1726882711.67813: in run() - task 0e448fcc-3ce9-313b-197e-00000000075e 25201 1726882711.67823: variable 'ansible_search_path' from source: unknown 25201 1726882711.67826: variable 'ansible_search_path' from source: unknown 25201 1726882711.67881: calling self._execute() 25201 1726882711.67938: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882711.67941: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882711.67950: variable 'omit' from source: magic vars 25201 1726882711.68243: variable 'ansible_distribution_major_version' from source: facts 25201 1726882711.68253: Evaluated conditional (ansible_distribution_major_version != '6'): True 25201 1726882711.68259: variable 'omit' from source: magic vars 25201 1726882711.68290: variable 'omit' from source: magic vars 25201 1726882711.68316: variable 'omit' from source: magic vars 25201 1726882711.68348: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25201 1726882711.68379: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25201 1726882711.68398: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25201 1726882711.68416: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25201 1726882711.68422: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25201 1726882711.68447: variable 'inventory_hostname' from source: host vars for 'managed_node2' 
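Before the Check routes and DNS execution that starts here, the include step above (task path tests_ipv6.yml:113) loaded check_network_dns.yml into the play for managed_node2 via include_tasks. A plausible reconstruction of that include, assuming a path relative to the playbook and the same distribution guard evaluated above; the actual file may differ:

    # Hypothetical reconstruction of tests_ipv6.yml:113
    - name: Verify network state restored to default
      include_tasks: tasks/check_network_dns.yml
      when: ansible_distribution_major_version != '6'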
25201 1726882711.68451: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882711.68454: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882711.68525: Set connection var ansible_shell_executable to /bin/sh 25201 1726882711.68529: Set connection var ansible_pipelining to False 25201 1726882711.68536: Set connection var ansible_connection to ssh 25201 1726882711.68541: Set connection var ansible_module_compression to ZIP_DEFLATED 25201 1726882711.68544: Set connection var ansible_shell_type to sh 25201 1726882711.68549: Set connection var ansible_timeout to 10 25201 1726882711.68569: variable 'ansible_shell_executable' from source: unknown 25201 1726882711.68573: variable 'ansible_connection' from source: unknown 25201 1726882711.68577: variable 'ansible_module_compression' from source: unknown 25201 1726882711.68580: variable 'ansible_shell_type' from source: unknown 25201 1726882711.68583: variable 'ansible_shell_executable' from source: unknown 25201 1726882711.68585: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882711.68587: variable 'ansible_pipelining' from source: unknown 25201 1726882711.68589: variable 'ansible_timeout' from source: unknown 25201 1726882711.68592: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882711.68696: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 25201 1726882711.68706: variable 'omit' from source: magic vars 25201 1726882711.68708: starting attempt loop 25201 1726882711.68711: running the handler 25201 1726882711.68720: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 25201 1726882711.68763: _low_level_execute_command(): starting 25201 1726882711.68768: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 25201 1726882711.69251: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882711.69270: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882711.69285: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 25201 1726882711.69308: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882711.69348: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing 
master <<< 25201 1726882711.69360: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882711.69508: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882711.71159: stdout chunk (state=3): >>>/root <<< 25201 1726882711.71269: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882711.71309: stderr chunk (state=3): >>><<< 25201 1726882711.71312: stdout chunk (state=3): >>><<< 25201 1726882711.71329: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25201 1726882711.71339: _low_level_execute_command(): starting 25201 1726882711.71349: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882711.713284-26647-209844487603604 `" && echo ansible-tmp-1726882711.713284-26647-209844487603604="` echo /root/.ansible/tmp/ansible-tmp-1726882711.713284-26647-209844487603604 `" ) && sleep 0' 25201 1726882711.71880: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25201 1726882711.71886: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882711.71896: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882711.71905: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882711.71939: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882711.72006: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882711.72059: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting 
O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882711.72153: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882711.74044: stdout chunk (state=3): >>>ansible-tmp-1726882711.713284-26647-209844487603604=/root/.ansible/tmp/ansible-tmp-1726882711.713284-26647-209844487603604 <<< 25201 1726882711.74162: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882711.74250: stderr chunk (state=3): >>><<< 25201 1726882711.74253: stdout chunk (state=3): >>><<< 25201 1726882711.74271: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882711.713284-26647-209844487603604=/root/.ansible/tmp/ansible-tmp-1726882711.713284-26647-209844487603604 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25201 1726882711.74297: variable 'ansible_module_compression' from source: unknown 25201 1726882711.74343: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-25201fmfeipqk/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 25201 1726882711.74383: variable 'ansible_facts' from source: unknown 25201 1726882711.74456: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882711.713284-26647-209844487603604/AnsiballZ_command.py 25201 1726882711.74625: Sending initial data 25201 1726882711.74629: Sent initial data (155 bytes) 25201 1726882711.75420: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882711.75423: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882711.75458: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882711.75462: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882711.75471: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882711.75519: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882711.75523: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882711.75628: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882711.77365: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 25201 1726882711.77460: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 25201 1726882711.77558: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-25201fmfeipqk/tmpauejlwzy /root/.ansible/tmp/ansible-tmp-1726882711.713284-26647-209844487603604/AnsiballZ_command.py <<< 25201 1726882711.77667: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 25201 1726882711.78980: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882711.79071: stderr chunk (state=3): >>><<< 25201 1726882711.79075: stdout chunk (state=3): >>><<< 25201 1726882711.79176: done transferring module to remote 25201 1726882711.79179: _low_level_execute_command(): starting 25201 1726882711.79181: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882711.713284-26647-209844487603604/ /root/.ansible/tmp/ansible-tmp-1726882711.713284-26647-209844487603604/AnsiballZ_command.py && sleep 0' 25201 1726882711.79728: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25201 1726882711.79738: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882711.79747: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882711.79759: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882711.79801: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882711.79804: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882711.79807: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25201 1726882711.79821: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882711.79826: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882711.79833: stderr chunk (state=3): >>>debug2: match found 
<<< 25201 1726882711.79843: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882711.79900: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882711.79918: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882711.80032: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882711.81826: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882711.81875: stderr chunk (state=3): >>><<< 25201 1726882711.81881: stdout chunk (state=3): >>><<< 25201 1726882711.81895: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25201 1726882711.81898: _low_level_execute_command(): starting 25201 1726882711.81903: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882711.713284-26647-209844487603604/AnsiballZ_command.py && sleep 0' 25201 1726882711.82583: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882711.82781: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882711.83001: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882711.96594: stdout chunk (state=3): >>> {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue 
state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 12:4f:68:7a:de:b1 brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.11.158/22 brd 10.31.11.255 scope global dynamic noprefixroute eth0\n valid_lft 3152sec preferred_lft 3152sec\n inet6 fe80::104f:68ff:fe7a:deb1/64 scope link \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.8.1 dev eth0 proto dhcp src 10.31.11.158 metric 100 \n10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.11.158 metric 100 \nIP -6 ROUTE\n::1 dev lo proto kernel metric 256 pref medium\nfe80::/64 dev eth0 proto kernel metric 256 pref medium\nRESOLV\n# Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-20 21:38:31.955104", "end": "2024-09-20 21:38:31.963704", "delta": "0:00:00.008600", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 25201 1726882711.97852: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
<<< 25201 1726882711.97857: stderr chunk (state=3): >>><<< 25201 1726882711.97859: stdout chunk (state=3): >>><<< 25201 1726882711.97893: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 12:4f:68:7a:de:b1 brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.11.158/22 brd 10.31.11.255 scope global dynamic noprefixroute eth0\n valid_lft 3152sec preferred_lft 3152sec\n inet6 fe80::104f:68ff:fe7a:deb1/64 scope link \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.8.1 dev eth0 proto dhcp src 10.31.11.158 metric 100 \n10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.11.158 metric 100 \nIP -6 ROUTE\n::1 dev lo proto kernel metric 256 pref medium\nfe80::/64 dev eth0 proto kernel metric 256 pref medium\nRESOLV\n# Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-20 21:38:31.955104", "end": "2024-09-20 21:38:31.963704", "delta": "0:00:00.008600", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
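The invocation above runs ansible.legacy.command with _uses_shell true, i.e. the shell action plugin loaded earlier, and again the raw module JSON reports changed: true while the final task result below shows changed: false. A plausible reconstruction of the 'Check routes and DNS' task at check_network_dns.yml:6, with the script copied verbatim from the module args above; the surrounding keywords (the shell form, changed_when) are assumptions:

    # Hypothetical reconstruction of check_network_dns.yml:6
    - name: Check routes and DNS
      shell: |
        set -euo pipefail
        echo IP
        ip a
        echo IP ROUTE
        ip route
        echo IP -6 ROUTE
        ip -6 route
        echo RESOLV
        if [ -f /etc/resolv.conf ]; then
         cat /etc/resolv.conf
        else
         echo NO /etc/resolv.conf
         ls -alrtF /etc/resolv.* || :
        fi
      changed_when: false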
25201 1726882711.97940: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882711.713284-26647-209844487603604/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 25201 1726882711.97948: _low_level_execute_command(): starting 25201 1726882711.97953: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882711.713284-26647-209844487603604/ > /dev/null 2>&1 && sleep 0' 25201 1726882712.00658: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882712.00662: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882712.00716: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882712.00720: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882712.00738: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882712.00744: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882712.00828: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882712.00846: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882712.00849: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882712.00984: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882712.02853: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882712.02857: stderr chunk (state=3): >>><<< 25201 1726882712.02862: stdout chunk (state=3): >>><<< 25201 1726882712.02885: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25201 1726882712.02891: handler run complete 25201 1726882712.02918: Evaluated conditional (False): False 25201 1726882712.02929: attempt loop complete, returning result 25201 1726882712.02932: _execute() done 25201 1726882712.02934: dumping result to json 25201 1726882712.02940: done dumping result, returning 25201 1726882712.02949: done running TaskExecutor() for managed_node2/TASK: Check routes and DNS [0e448fcc-3ce9-313b-197e-00000000075e] 25201 1726882712.02955: sending task result for task 0e448fcc-3ce9-313b-197e-00000000075e 25201 1726882712.03068: done sending task result for task 0e448fcc-3ce9-313b-197e-00000000075e 25201 1726882712.03071: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "delta": "0:00:00.008600", "end": "2024-09-20 21:38:31.963704", "rc": 0, "start": "2024-09-20 21:38:31.955104" } STDOUT: IP 1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000 link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00 inet 127.0.0.1/8 scope host lo valid_lft forever preferred_lft forever inet6 ::1/128 scope host valid_lft forever preferred_lft forever 2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000 link/ether 12:4f:68:7a:de:b1 brd ff:ff:ff:ff:ff:ff altname enX0 inet 10.31.11.158/22 brd 10.31.11.255 scope global dynamic noprefixroute eth0 valid_lft 3152sec preferred_lft 3152sec inet6 fe80::104f:68ff:fe7a:deb1/64 scope link valid_lft forever preferred_lft forever IP ROUTE default via 10.31.8.1 dev eth0 proto dhcp src 10.31.11.158 metric 100 10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.11.158 metric 100 IP -6 ROUTE ::1 dev lo proto kernel metric 256 pref medium fe80::/64 dev eth0 proto kernel metric 256 pref medium RESOLV # Generated by NetworkManager search us-east-1.aws.redhat.com nameserver 10.29.169.13 nameserver 10.29.170.12 nameserver 10.2.32.1 25201 1726882712.03160: no more pending results, returning what we have 25201 1726882712.03167: results queue empty 25201 1726882712.03168: checking for any_errors_fatal 25201 1726882712.03169: done checking for any_errors_fatal 25201 1726882712.03170: checking for max_fail_percentage 25201 1726882712.03171: done checking for max_fail_percentage 25201 1726882712.03172: checking to see if all hosts have failed and the running result is not ok 25201 1726882712.03173: done checking to see if all hosts have failed 25201 1726882712.03174: getting the remaining hosts for this loop 25201 1726882712.03176: done getting the remaining hosts for this loop 25201 1726882712.03179: 
getting the next task for host managed_node2 25201 1726882712.03187: done getting next task for host managed_node2 25201 1726882712.03189: ^ task is: TASK: Verify DNS and network connectivity 25201 1726882712.03192: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=8, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 25201 1726882712.03195: getting variables 25201 1726882712.03197: in VariableManager get_vars() 25201 1726882712.03236: Calling all_inventory to load vars for managed_node2 25201 1726882712.03239: Calling groups_inventory to load vars for managed_node2 25201 1726882712.03241: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882712.03250: Calling all_plugins_play to load vars for managed_node2 25201 1726882712.03252: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882712.03254: Calling groups_plugins_play to load vars for managed_node2 25201 1726882712.05787: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882712.10370: done with get_vars() 25201 1726882712.10517: done getting variables 25201 1726882712.10686: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Verify DNS and network connectivity] ************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24 Friday 20 September 2024 21:38:32 -0400 (0:00:00.435) 0:00:33.281 ****** 25201 1726882712.10834: entering _queue_task() for managed_node2/shell 25201 1726882712.11508: worker is 1 (out of 1 available) 25201 1726882712.11520: exiting _queue_task() for managed_node2/shell 25201 1726882712.11531: done queuing things up, now waiting for results queue to drain 25201 1726882712.11532: waiting for pending results... 
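For readability, here is the shell payload that the "Check routes and DNS" task above passed to ansible.legacy.command (the _raw_params shown in its result, run with _uses_shell=True), rewritten as a standalone script. This is a sketch reconstructed from the log, not the playbook source itself; the shebang, comments, and indentation are additions, while the commands are taken from the invocation recorded above.

#!/bin/bash
# Diagnostic dump of addressing, routing, and resolver state.
set -euo pipefail
echo IP
ip a                          # all interfaces and their addresses
echo IP ROUTE
ip route                      # IPv4 routing table
echo IP -6 ROUTE
ip -6 route                   # IPv6 routing table
echo RESOLV
if [ -f /etc/resolv.conf ]; then
    cat /etc/resolv.conf      # current resolver configuration
else
    echo NO /etc/resolv.conf
    ls -alrtF /etc/resolv.* || :   # list whatever resolv.* files exist; "|| :" keeps this step from failing the task
fi

With set -euo pipefail, any failing command (or reference to an unset variable) aborts the script with a non-zero rc, so the failure shows up in the task result rather than being silently swallowed; only the final ls is deliberately exempted via "|| :".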
25201 1726882712.12485: running TaskExecutor() for managed_node2/TASK: Verify DNS and network connectivity 25201 1726882712.12708: in run() - task 0e448fcc-3ce9-313b-197e-00000000075f 25201 1726882712.12728: variable 'ansible_search_path' from source: unknown 25201 1726882712.12735: variable 'ansible_search_path' from source: unknown 25201 1726882712.12776: calling self._execute() 25201 1726882712.12893: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882712.13022: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882712.13038: variable 'omit' from source: magic vars 25201 1726882712.13817: variable 'ansible_distribution_major_version' from source: facts 25201 1726882712.13836: Evaluated conditional (ansible_distribution_major_version != '6'): True 25201 1726882712.14218: variable 'ansible_facts' from source: unknown 25201 1726882712.15855: Evaluated conditional (ansible_facts["distribution"] == "CentOS"): True 25201 1726882712.15953: variable 'omit' from source: magic vars 25201 1726882712.15997: variable 'omit' from source: magic vars 25201 1726882712.16027: variable 'omit' from source: magic vars 25201 1726882712.16184: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25201 1726882712.16222: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25201 1726882712.16248: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25201 1726882712.16290: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25201 1726882712.16388: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25201 1726882712.16421: variable 'inventory_hostname' from source: host vars for 'managed_node2' 25201 1726882712.16432: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882712.16440: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882712.16653: Set connection var ansible_shell_executable to /bin/sh 25201 1726882712.16710: Set connection var ansible_pipelining to False 25201 1726882712.16722: Set connection var ansible_connection to ssh 25201 1726882712.16820: Set connection var ansible_module_compression to ZIP_DEFLATED 25201 1726882712.16828: Set connection var ansible_shell_type to sh 25201 1726882712.16841: Set connection var ansible_timeout to 10 25201 1726882712.16874: variable 'ansible_shell_executable' from source: unknown 25201 1726882712.16882: variable 'ansible_connection' from source: unknown 25201 1726882712.16890: variable 'ansible_module_compression' from source: unknown 25201 1726882712.16896: variable 'ansible_shell_type' from source: unknown 25201 1726882712.16902: variable 'ansible_shell_executable' from source: unknown 25201 1726882712.16910: variable 'ansible_host' from source: host vars for 'managed_node2' 25201 1726882712.16922: variable 'ansible_pipelining' from source: unknown 25201 1726882712.17034: variable 'ansible_timeout' from source: unknown 25201 1726882712.17047: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25201 1726882712.17225: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 25201 1726882712.17371: variable 'omit' from source: magic vars 25201 1726882712.17381: starting attempt loop 25201 1726882712.17387: running the handler 25201 1726882712.17403: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 25201 1726882712.17427: _low_level_execute_command(): starting 25201 1726882712.17439: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 25201 1726882712.19450: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25201 1726882712.19467: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882712.19549: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882712.19573: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882712.19618: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882712.19631: stderr chunk (state=3): >>>debug2: match not found <<< 25201 1726882712.19653: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882712.19677: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25201 1726882712.19690: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 25201 1726882712.19759: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 25201 1726882712.19779: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882712.19794: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882712.19809: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882712.19822: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882712.19834: stderr chunk (state=3): >>>debug2: match found <<< 25201 1726882712.19848: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882712.19927: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882712.19990: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882712.20006: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882712.20209: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882712.21867: stdout chunk (state=3): >>>/root <<< 25201 1726882712.22074: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882712.22077: stdout chunk (state=3): >>><<< 25201 1726882712.22080: stderr chunk (state=3): >>><<< 25201 1726882712.22204: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25201 1726882712.22214: _low_level_execute_command(): starting 25201 1726882712.22218: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882712.2210233-26668-181746863250415 `" && echo ansible-tmp-1726882712.2210233-26668-181746863250415="` echo /root/.ansible/tmp/ansible-tmp-1726882712.2210233-26668-181746863250415 `" ) && sleep 0' 25201 1726882712.24470: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882712.24474: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882712.24703: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882712.24769: stderr chunk (state=3): >>>debug2: match not found <<< 25201 1726882712.24783: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882712.24797: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25201 1726882712.24804: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 25201 1726882712.24811: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 25201 1726882712.24818: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882712.24830: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882712.24838: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882712.25086: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882712.25094: stderr chunk (state=3): >>>debug2: match found <<< 25201 1726882712.25104: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882712.25177: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882712.25193: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882712.25204: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882712.25418: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882712.27304: stdout chunk (state=3): 
>>>ansible-tmp-1726882712.2210233-26668-181746863250415=/root/.ansible/tmp/ansible-tmp-1726882712.2210233-26668-181746863250415 <<< 25201 1726882712.27484: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882712.27487: stdout chunk (state=3): >>><<< 25201 1726882712.27495: stderr chunk (state=3): >>><<< 25201 1726882712.27585: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882712.2210233-26668-181746863250415=/root/.ansible/tmp/ansible-tmp-1726882712.2210233-26668-181746863250415 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25201 1726882712.27589: variable 'ansible_module_compression' from source: unknown 25201 1726882712.27605: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-25201fmfeipqk/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 25201 1726882712.27639: variable 'ansible_facts' from source: unknown 25201 1726882712.27728: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882712.2210233-26668-181746863250415/AnsiballZ_command.py 25201 1726882712.28607: Sending initial data 25201 1726882712.28610: Sent initial data (156 bytes) 25201 1726882712.31471: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25201 1726882712.31476: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882712.31479: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882712.31481: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882712.31483: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882712.31486: stderr chunk (state=3): >>>debug2: match not found <<< 25201 1726882712.31488: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882712.31490: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25201 1726882712.31527: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 25201 1726882712.31532: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 25201 1726882712.31541: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882712.31550: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 25201 1726882712.31568: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882712.31578: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882712.31585: stderr chunk (state=3): >>>debug2: match found <<< 25201 1726882712.31595: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882712.31674: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882712.31686: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882712.31748: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882712.31961: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882712.33748: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 25201 1726882712.33843: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 25201 1726882712.33944: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-25201fmfeipqk/tmptamka4ci /root/.ansible/tmp/ansible-tmp-1726882712.2210233-26668-181746863250415/AnsiballZ_command.py <<< 25201 1726882712.34040: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 25201 1726882712.35735: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882712.35960: stderr chunk (state=3): >>><<< 25201 1726882712.35964: stdout chunk (state=3): >>><<< 25201 1726882712.35995: done transferring module to remote 25201 1726882712.36006: _low_level_execute_command(): starting 25201 1726882712.36011: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882712.2210233-26668-181746863250415/ /root/.ansible/tmp/ansible-tmp-1726882712.2210233-26668-181746863250415/AnsiballZ_command.py && sleep 0' 25201 1726882712.37645: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25201 1726882712.37659: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882712.37731: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882712.37748: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882712.37790: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882712.37803: stderr chunk (state=3): >>>debug2: match not found <<< 25201 1726882712.37819: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882712.37846: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25201 1726882712.37947: stderr chunk 
(state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 25201 1726882712.37965: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 25201 1726882712.37983: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882712.38000: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882712.38017: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882712.38029: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882712.38051: stderr chunk (state=3): >>>debug2: match found <<< 25201 1726882712.38074: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882712.38150: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882712.38191: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882712.38206: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882712.38337: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882712.40189: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882712.40253: stderr chunk (state=3): >>><<< 25201 1726882712.40256: stdout chunk (state=3): >>><<< 25201 1726882712.40350: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25201 1726882712.40354: _low_level_execute_command(): starting 25201 1726882712.40357: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882712.2210233-26668-181746863250415/AnsiballZ_command.py && sleep 0' 25201 1726882712.41195: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25201 1726882712.41210: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882712.41225: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882712.41365: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882712.41407: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882712.41421: 
stderr chunk (state=3): >>>debug2: match not found <<< 25201 1726882712.41435: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882712.41461: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25201 1726882712.41479: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 25201 1726882712.41491: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 25201 1726882712.41504: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882712.41518: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882712.41534: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882712.41546: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882712.41558: stderr chunk (state=3): >>>debug2: match found <<< 25201 1726882712.41584: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882712.41658: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882712.41799: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882712.41817: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882712.42001: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882712.82744: stdout chunk (state=3): >>> {"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 11296 0 --:--:-- --:--:-- --:--:-- 11730\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 1299 0 --:--:-- --:--:-- --:--:-- 1299", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! 
getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-20 21:38:32.550592", "end": "2024-09-20 21:38:32.825247", "delta": "0:00:00.274655", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 25201 1726882712.84101: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. <<< 25201 1726882712.84203: stderr chunk (state=3): >>><<< 25201 1726882712.84207: stdout chunk (state=3): >>><<< 25201 1726882712.84252: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 11296 0 --:--:-- --:--:-- --:--:-- 11730\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 1299 0 --:--:-- --:--:-- --:--:-- 1299", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-20 21:38:32.550592", "end": "2024-09-20 21:38:32.825247", "delta": "0:00:00.274655", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! 
getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 25201 1726882712.84381: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts "$host"; then\n echo FAILED to lookup host "$host"\n exit 1\n fi\n if ! 
curl -o /dev/null https://"$host"; then\n echo FAILED to contact host "$host"\n exit 1\n fi\ndone\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882712.2210233-26668-181746863250415/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 25201 1726882712.84385: _low_level_execute_command(): starting 25201 1726882712.84387: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882712.2210233-26668-181746863250415/ > /dev/null 2>&1 && sleep 0' 25201 1726882712.86027: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25201 1726882712.86041: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882712.86057: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882712.86077: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882712.86144: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882712.86160: stderr chunk (state=3): >>>debug2: match not found <<< 25201 1726882712.86178: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882712.86218: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25201 1726882712.86234: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 25201 1726882712.86246: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 25201 1726882712.86258: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25201 1726882712.86275: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25201 1726882712.86290: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25201 1726882712.86301: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 25201 1726882712.86311: stderr chunk (state=3): >>>debug2: match found <<< 25201 1726882712.86325: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25201 1726882712.86404: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 25201 1726882712.86424: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25201 1726882712.86442: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25201 1726882712.86578: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25201 1726882712.88389: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25201 1726882712.88452: stderr chunk (state=3): >>><<< 25201 1726882712.88454: stdout chunk (state=3): >>><<< 25201 1726882712.88468: _low_level_execute_command() done: rc=0, stdout=, 
stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25201 1726882712.88476: handler run complete 25201 1726882712.88493: Evaluated conditional (False): False 25201 1726882712.88502: attempt loop complete, returning result 25201 1726882712.88504: _execute() done 25201 1726882712.88507: dumping result to json 25201 1726882712.88514: done dumping result, returning 25201 1726882712.88521: done running TaskExecutor() for managed_node2/TASK: Verify DNS and network connectivity [0e448fcc-3ce9-313b-197e-00000000075f] 25201 1726882712.88527: sending task result for task 0e448fcc-3ce9-313b-197e-00000000075f 25201 1726882712.88629: done sending task result for task 0e448fcc-3ce9-313b-197e-00000000075f 25201 1726882712.88631: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! 
curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "delta": "0:00:00.274655", "end": "2024-09-20 21:38:32.825247", "rc": 0, "start": "2024-09-20 21:38:32.550592" } STDOUT: CHECK DNS AND CONNECTIVITY 2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org 2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org 2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org 2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org 2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org STDERR: % Total % Received % Xferd Average Speed Time Time Time Current Dload Upload Total Spent Left Speed 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 100 305 100 305 0 0 11296 0 --:--:-- --:--:-- --:--:-- 11730 % Total % Received % Xferd Average Speed Time Time Time Current Dload Upload Total Spent Left Speed 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 100 291 100 291 0 0 1299 0 --:--:-- --:--:-- --:--:-- 1299 25201 1726882712.88699: no more pending results, returning what we have 25201 1726882712.88703: results queue empty 25201 1726882712.88704: checking for any_errors_fatal 25201 1726882712.88715: done checking for any_errors_fatal 25201 1726882712.88715: checking for max_fail_percentage 25201 1726882712.88717: done checking for max_fail_percentage 25201 1726882712.88718: checking to see if all hosts have failed and the running result is not ok 25201 1726882712.88719: done checking to see if all hosts have failed 25201 1726882712.88719: getting the remaining hosts for this loop 25201 1726882712.88721: done getting the remaining hosts for this loop 25201 1726882712.88725: getting the next task for host managed_node2 25201 1726882712.88733: done getting next task for host managed_node2 25201 1726882712.88734: ^ task is: TASK: meta (flush_handlers) 25201 1726882712.88736: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25201 1726882712.88740: getting variables 25201 1726882712.88742: in VariableManager get_vars() 25201 1726882712.88783: Calling all_inventory to load vars for managed_node2 25201 1726882712.88786: Calling groups_inventory to load vars for managed_node2 25201 1726882712.88788: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882712.88798: Calling all_plugins_play to load vars for managed_node2 25201 1726882712.88800: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882712.88802: Calling groups_plugins_play to load vars for managed_node2 25201 1726882712.89988: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882712.92525: done with get_vars() 25201 1726882712.92542: done getting variables 25201 1726882712.92611: in VariableManager get_vars() 25201 1726882712.92627: Calling all_inventory to load vars for managed_node2 25201 1726882712.92629: Calling groups_inventory to load vars for managed_node2 25201 1726882712.92632: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882712.92637: Calling all_plugins_play to load vars for managed_node2 25201 1726882712.92639: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882712.92641: Calling groups_plugins_play to load vars for managed_node2 25201 1726882712.93917: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882712.95615: done with get_vars() 25201 1726882712.95648: done queuing things up, now waiting for results queue to drain 25201 1726882712.95650: results queue empty 25201 1726882712.95651: checking for any_errors_fatal 25201 1726882712.95654: done checking for any_errors_fatal 25201 1726882712.95655: checking for max_fail_percentage 25201 1726882712.95656: done checking for max_fail_percentage 25201 1726882712.95657: checking to see if all hosts have failed and the running result is not ok 25201 1726882712.95657: done checking to see if all hosts have failed 25201 1726882712.95658: getting the remaining hosts for this loop 25201 1726882712.95659: done getting the remaining hosts for this loop 25201 1726882712.95661: getting the next task for host managed_node2 25201 1726882712.95668: done getting next task for host managed_node2 25201 1726882712.95671: ^ task is: TASK: meta (flush_handlers) 25201 1726882712.95672: ^ state is: HOST STATE: block=5, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25201 1726882712.95678: getting variables 25201 1726882712.95679: in VariableManager get_vars() 25201 1726882712.95697: Calling all_inventory to load vars for managed_node2 25201 1726882712.95700: Calling groups_inventory to load vars for managed_node2 25201 1726882712.95702: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882712.95706: Calling all_plugins_play to load vars for managed_node2 25201 1726882712.95709: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882712.95711: Calling groups_plugins_play to load vars for managed_node2 25201 1726882712.97335: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882713.00026: done with get_vars() 25201 1726882713.00048: done getting variables 25201 1726882713.00101: in VariableManager get_vars() 25201 1726882713.00115: Calling all_inventory to load vars for managed_node2 25201 1726882713.00121: Calling groups_inventory to load vars for managed_node2 25201 1726882713.00123: Calling all_plugins_inventory to load vars for managed_node2 25201 1726882713.00128: Calling all_plugins_play to load vars for managed_node2 25201 1726882713.00130: Calling groups_plugins_inventory to load vars for managed_node2 25201 1726882713.00132: Calling groups_plugins_play to load vars for managed_node2 25201 1726882713.01441: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25201 1726882713.03685: done with get_vars() 25201 1726882713.03710: done queuing things up, now waiting for results queue to drain 25201 1726882713.03716: results queue empty 25201 1726882713.03717: checking for any_errors_fatal 25201 1726882713.03719: done checking for any_errors_fatal 25201 1726882713.03719: checking for max_fail_percentage 25201 1726882713.03720: done checking for max_fail_percentage 25201 1726882713.03721: checking to see if all hosts have failed and the running result is not ok 25201 1726882713.03722: done checking to see if all hosts have failed 25201 1726882713.03723: getting the remaining hosts for this loop 25201 1726882713.03724: done getting the remaining hosts for this loop 25201 1726882713.03726: getting the next task for host managed_node2 25201 1726882713.03736: done getting next task for host managed_node2 25201 1726882713.03737: ^ task is: None 25201 1726882713.03738: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25201 1726882713.03739: done queuing things up, now waiting for results queue to drain 25201 1726882713.03740: results queue empty 25201 1726882713.03741: checking for any_errors_fatal 25201 1726882713.03742: done checking for any_errors_fatal 25201 1726882713.03742: checking for max_fail_percentage 25201 1726882713.03743: done checking for max_fail_percentage 25201 1726882713.03744: checking to see if all hosts have failed and the running result is not ok 25201 1726882713.03745: done checking to see if all hosts have failed 25201 1726882713.03761: getting the next task for host managed_node2 25201 1726882713.03770: done getting next task for host managed_node2 25201 1726882713.03771: ^ task is: None 25201 1726882713.03776: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False

PLAY RECAP *********************************************************************
managed_node2 : ok=76 changed=2 unreachable=0 failed=0 skipped=62 rescued=0 ignored=0

Friday 20 September 2024 21:38:33 -0400 (0:00:00.931) 0:00:34.213 ******
===============================================================================
fedora.linux_system_roles.network : Configure networking connection profiles --- 2.36s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
fedora.linux_system_roles.network : Check which services are running ---- 1.78s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.71s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
Gathering Facts --------------------------------------------------------- 1.59s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tests_ipv6_nm.yml:6
Install iproute --------------------------------------------------------- 1.50s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16
Install iproute --------------------------------------------------------- 1.44s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16
Ensure ping6 command is present ----------------------------------------- 1.39s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:81
Create veth interface veth0 --------------------------------------------- 1.16s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27
fedora.linux_system_roles.network : Check which packages are installed --- 1.12s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
Gathering Facts --------------------------------------------------------- 1.02s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:3
Verify DNS and network connectivity ------------------------------------- 0.93s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24
fedora.linux_system_roles.network : Check which packages are installed --- 0.81s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 0.78s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
Gather the minimum subset of ansible_facts required by the network role test --- 0.73s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3
Check if system is ostree ----------------------------------------------- 0.73s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17
fedora.linux_system_roles.network : Configure networking connection profiles --- 0.72s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
Gather current interface info ------------------------------------------- 0.61s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 0.56s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
Clean up namespace ------------------------------------------------------ 0.52s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:108
Get ip address information ---------------------------------------------- 0.51s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:53
25201 1726882713.03984: RUNNING CLEANUP
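The "Verify DNS and network connectivity" task executed the following shell payload (again via ansible.legacy.command with _uses_shell=True). As above, this is a sketch reconstructed from the _raw_params recorded in the log rather than from check_network_dns.yml itself; only the shebang, comments, and indentation are additions.

#!/bin/bash
# Fail fast: any lookup or download error aborts the script with a non-zero rc.
set -euo pipefail
echo CHECK DNS AND CONNECTIVITY
for host in mirrors.fedoraproject.org mirrors.centos.org; do
    # Name resolution check (DNS or hosts file) via the system resolver
    if ! getent hosts "$host"; then
        echo FAILED to lookup host "$host"
        exit 1
    fi
    # Reachability check over HTTPS; the response body is discarded, only curl's exit code matters
    if ! curl -o /dev/null https://"$host"; then
        echo FAILED to contact host "$host"
        exit 1
    fi
done

The getent output for each host is what appears under STDOUT in the task result above, curl's progress meter accounts for the STDERR block, and rc=0 confirms that both names resolved and both HTTPS endpoints answered.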