[DEPRECATION WARNING]: ANSIBLE_COLLECTIONS_PATHS option, does not fit var naming standard, use the singular form ANSIBLE_COLLECTIONS_PATH instead. This feature will be removed from ansible-core in version 2.19. Deprecation warnings can be disabled by setting deprecation_warnings=False in ansible.cfg.
19110 1726882542.85203: starting run
ansible-playbook [core 2.17.4]
  config file = None
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/local/lib/python3.12/site-packages/ansible
  ansible collection location = /tmp/collections-Xyq
  executable location = /usr/local/bin/ansible-playbook
  python version = 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] (/usr/bin/python3.12)
  jinja version = 3.1.4
  libyaml = True
No config file found; using defaults
19110 1726882542.85484: Added group all to inventory
19110 1726882542.85487: Added group ungrouped to inventory
19110 1726882542.85490: Group all now contains ungrouped
19110 1726882542.85492: Examining possible inventory source: /tmp/network-91m/inventory.yml
19110 1726882542.94248: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/cache
19110 1726882542.94292: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py
19110 1726882542.94308: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory
19110 1726882542.94345: Loading InventoryModule 'host_list' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py
19110 1726882542.94396: Loaded config def from plugin (inventory/script)
19110 1726882542.94398: Loading InventoryModule 'script' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py
19110 1726882542.94425: Loading InventoryModule 'auto' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py
19110 1726882542.94483: Loaded config def from plugin (inventory/yaml)
19110 1726882542.94485: Loading InventoryModule 'yaml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py
19110 1726882542.94541: Loading InventoryModule 'ini' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/ini.py
19110 1726882542.94811: Loading InventoryModule 'toml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/toml.py
19110 1726882542.94814: Attempting to use plugin host_list (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py)
19110 1726882542.94816: Attempting to use plugin script (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py)
19110 1726882542.94821: Attempting to use plugin auto (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py)
19110 1726882542.94824: Loading data from /tmp/network-91m/inventory.yml
19110 1726882542.94867: /tmp/network-91m/inventory.yml was not parsable by auto
19110 1726882542.94912: Attempting to use plugin yaml (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py)
19110 1726882542.94939: Loading data from /tmp/network-91m/inventory.yml
19110 1726882542.94993: group all already in inventory
19110 1726882542.94997: set inventory_file for managed_node1
19110 1726882542.95001: set inventory_dir for managed_node1
19110 1726882542.95002: Added host managed_node1 to inventory
19110 1726882542.95004: Added host managed_node1 to group all
19110 1726882542.95005: set ansible_host for managed_node1
19110 1726882542.95006: set ansible_ssh_extra_args for managed_node1
19110 1726882542.95008: set inventory_file for managed_node2
19110 1726882542.95010: set inventory_dir for managed_node2
19110 1726882542.95011: Added host managed_node2 to inventory
19110 1726882542.95012: Added host managed_node2 to group all
19110 1726882542.95012: set ansible_host for managed_node2
19110 1726882542.95013: set ansible_ssh_extra_args for managed_node2
19110 1726882542.95014: set inventory_file for managed_node3
19110 1726882542.95016: set inventory_dir for managed_node3
19110 1726882542.95016: Added host managed_node3 to inventory
19110 1726882542.95017: Added host managed_node3 to group all
19110 1726882542.95017: set ansible_host for managed_node3
19110 1726882542.95018: set ansible_ssh_extra_args for managed_node3
19110 1726882542.95020: Reconcile groups and hosts in inventory.
19110 1726882542.95022: Group ungrouped now contains managed_node1
19110 1726882542.95023: Group ungrouped now contains managed_node2
19110 1726882542.95024: Group ungrouped now contains managed_node3
19110 1726882542.95079: '/usr/local/lib/python3.12/site-packages/ansible/plugins/vars/__init__' skipped due to reserved name
19110 1726882542.95159: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments
19110 1726882542.95190: Loading ModuleDocFragment 'vars_plugin_staging' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/vars_plugin_staging.py
19110 1726882542.95208: Loaded config def from plugin (vars/host_group_vars)
19110 1726882542.95209: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=False, class_only=True)
19110 1726882542.95214: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/vars
19110 1726882542.95219: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False)
19110 1726882542.95249: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py (found_in_cache=True, class_only=False)
19110 1726882542.95477: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
19110 1726882542.95536: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py
19110 1726882542.95563: Loaded config def from plugin (connection/local)
19110 1726882542.95567: Loading Connection 'local' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/local.py (found_in_cache=False, class_only=True)
19110 1726882542.95895: Loaded config def from plugin (connection/paramiko_ssh)
19110 1726882542.95897: Loading Connection 'paramiko_ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/paramiko_ssh.py (found_in_cache=False, class_only=True)
19110 1726882542.96478: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
19110 1726882542.96501: Loaded config def from plugin (connection/psrp)
19110 1726882542.96503: Loading Connection 'psrp' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/psrp.py (found_in_cache=False, class_only=True)
19110 1726882542.96960: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
19110 1726882542.96985: Loaded config def from plugin (connection/ssh)
19110 1726882542.96987: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=False, class_only=True)
19110 1726882542.98248: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
19110 1726882542.98292: Loaded config def from plugin (connection/winrm)
19110 1726882542.98294: Loading Connection 'winrm' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/winrm.py (found_in_cache=False, class_only=True)
19110 1726882542.98324: '/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/__init__' skipped due to reserved name
19110 1726882542.98385: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py
19110 1726882542.98449: Loaded config def from plugin (shell/cmd)
19110 1726882542.98451: Loading ShellModule 'cmd' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/cmd.py (found_in_cache=False, class_only=True)
19110 1726882542.98476: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py (found_in_cache=True, class_only=False)
19110 1726882542.98541: Loaded config def from plugin (shell/powershell)
19110 1726882542.98543: Loading ShellModule 'powershell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/powershell.py (found_in_cache=False, class_only=True)
19110 1726882542.98596: Loading ModuleDocFragment 'shell_common' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_common.py
19110 1726882542.98778: Loaded config def from plugin (shell/sh)
19110 1726882542.98780: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=False, class_only=True)
19110 1726882542.98810: '/usr/local/lib/python3.12/site-packages/ansible/plugins/become/__init__' skipped due to reserved name
19110 1726882542.99060: Loaded config def from plugin (become/runas)
19110 1726882542.99065: Loading BecomeModule 'runas' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/runas.py (found_in_cache=False, class_only=True)
19110 1726882542.99245: Loaded config def from plugin (become/su)
19110 1726882542.99247: Loading BecomeModule 'su' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/su.py (found_in_cache=False, class_only=True)
19110 1726882542.99406: Loaded config def from plugin (become/sudo)
19110 1726882542.99409: Loading BecomeModule 'sudo' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/sudo.py (found_in_cache=False, class_only=True)
running playbook inside collection fedora.linux_system_roles
19110 1726882542.99439: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tests_ethernet_nm.yml
19110 1726882542.99707: in VariableManager get_vars()
19110 1726882542.99723: done with get_vars()
19110 1726882542.99813: trying /usr/local/lib/python3.12/site-packages/ansible/modules
19110 1726882543.01815: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action
19110 1726882543.01921: in VariableManager get_vars()
19110 1726882543.01926: done with get_vars()
19110 1726882543.01929: variable 'playbook_dir' from source: magic vars
19110 1726882543.01930: variable 'ansible_playbook_python' from source: magic vars
19110 1726882543.01930: variable 'ansible_config_file' from source: magic vars
19110 1726882543.01931: variable 'groups' from source: magic vars
19110 1726882543.01932: variable 'omit' from source: magic vars
19110 1726882543.01933: variable 'ansible_version' from source: magic vars
19110 1726882543.01933: variable 'ansible_check_mode' from source: magic vars
19110 1726882543.01934: variable 'ansible_diff_mode' from source: magic vars
19110 1726882543.01935: variable 'ansible_forks' from source: magic vars
19110 1726882543.01936: variable 'ansible_inventory_sources' from source: magic vars
19110 1726882543.01936: variable 'ansible_skip_tags' from source: magic vars
19110 1726882543.01937: variable 'ansible_limit' from source: magic vars
19110 1726882543.01938: variable 'ansible_run_tags' from source: magic vars
19110 1726882543.01939: variable 'ansible_verbosity' from source: magic vars
19110 1726882543.01973: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml
19110 1726882543.02588: in VariableManager get_vars()
19110 1726882543.02602: done with get_vars()
19110 1726882543.02639: in VariableManager get_vars()
19110 1726882543.02658: done with get_vars()
19110 1726882543.02695: in VariableManager get_vars()
19110 1726882543.02707: done with get_vars()
19110 1726882543.02734: in VariableManager get_vars()
19110 1726882543.02743: done with get_vars()
19110 1726882543.02807: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
19110 1726882543.03005: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
19110 1726882543.03145: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
19110 1726882543.03812: in VariableManager get_vars()
19110 1726882543.03831: done with get_vars()
19110 1726882543.04229: trying /usr/local/lib/python3.12/site-packages/ansible/modules/__pycache__
19110 1726882543.04366: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
19110 1726882543.05575: in VariableManager get_vars()
19110 1726882543.05594: done with get_vars()
19110 1726882543.05720: in VariableManager get_vars()
19110 1726882543.05724: done with get_vars()
19110 1726882543.05726: variable 'playbook_dir' from source: magic vars
19110 1726882543.05727: variable 'ansible_playbook_python' from source: magic vars
19110 1726882543.05727: variable 'ansible_config_file' from source: magic vars
19110 1726882543.05728: variable 'groups' from source: magic vars
19110 1726882543.05729: variable 'omit' from source: magic vars
19110 1726882543.05730: variable 'ansible_version' from source: magic vars
19110 1726882543.05730: variable 'ansible_check_mode' from source: magic vars
19110 1726882543.05731: variable 'ansible_diff_mode' from source: magic vars
19110 1726882543.05732: variable 'ansible_forks' from source: magic vars
19110 1726882543.05733: variable 'ansible_inventory_sources' from source: magic vars
19110 1726882543.05733: variable 'ansible_skip_tags' from source: magic vars
19110 1726882543.05734: variable 'ansible_limit' from source: magic vars
19110 1726882543.05735: variable 'ansible_run_tags' from source: magic vars
19110 1726882543.05736: variable 'ansible_verbosity' from source: magic vars
19110 1726882543.05768: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile+delete_interface.yml
19110 1726882543.05839: in VariableManager get_vars()
19110 1726882543.05842: done with get_vars()
19110 1726882543.05844: variable 'playbook_dir' from source: magic vars
19110 1726882543.05845: variable 'ansible_playbook_python' from source: magic vars
19110 1726882543.05846: variable 'ansible_config_file' from source: magic vars
19110 1726882543.05847: variable 'groups' from source: magic vars
19110 1726882543.05847: variable 'omit' from source: magic vars
19110 1726882543.05848: variable 'ansible_version' from source: magic vars
19110 1726882543.05849: variable 'ansible_check_mode' from source: magic vars
19110 1726882543.05850: variable 'ansible_diff_mode' from source: magic vars
19110 1726882543.05850: variable 'ansible_forks' from source: magic vars
19110 1726882543.05851: variable 'ansible_inventory_sources' from source: magic vars
19110 1726882543.05852: variable 'ansible_skip_tags' from source: magic vars
19110 1726882543.05853: variable 'ansible_limit' from source: magic vars
19110 1726882543.05853: variable 'ansible_run_tags' from source: magic vars
19110 1726882543.05854: variable 'ansible_verbosity' from source: magic vars
19110 1726882543.05886: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile.yml
19110 1726882543.05966: in VariableManager get_vars()
19110 1726882543.05977: done with get_vars()
19110 1726882543.06019: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
19110 1726882543.06118: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
19110 1726882543.06181: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
19110 1726882543.06566: in VariableManager get_vars()
19110 1726882543.06586: done with get_vars()
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
19110 1726882543.08140: in VariableManager get_vars()
19110 1726882543.08160: done with get_vars()
19110 1726882543.08199: in VariableManager get_vars()
19110 1726882543.08202: done with get_vars()
19110 1726882543.08204: variable 'playbook_dir' from source: magic vars
19110 1726882543.08205: variable 'ansible_playbook_python' from source: magic vars
19110 1726882543.08206: variable 'ansible_config_file' from source: magic vars
19110 1726882543.08207: variable 'groups' from source: magic vars
19110 1726882543.08207: variable 'omit' from source: magic vars
19110 1726882543.08208: variable 'ansible_version' from source: magic vars
19110 1726882543.08209: variable 'ansible_check_mode' from source: magic vars
19110 1726882543.08209: variable 'ansible_diff_mode' from source: magic vars
19110 1726882543.08210: variable 'ansible_forks' from source: magic vars
19110 1726882543.08211: variable 'ansible_inventory_sources' from source: magic vars
19110 1726882543.08211: variable 'ansible_skip_tags' from source: magic vars
19110 1726882543.08212: variable 'ansible_limit' from source: magic vars
19110 1726882543.08213: variable 'ansible_run_tags' from source: magic vars
19110 1726882543.08214: variable 'ansible_verbosity' from source: magic vars
19110 1726882543.08246: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/remove_profile.yml
19110 1726882543.08319: in VariableManager get_vars()
19110 1726882543.08330: done with get_vars()
19110 1726882543.08373: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
19110 1726882543.08502: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
19110 1726882543.08582: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
19110 1726882543.09039: in VariableManager get_vars()
19110 1726882543.09057: done with get_vars()
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
19110 1726882543.10575: in VariableManager get_vars()
19110 1726882543.10588: done with get_vars()
19110 1726882543.10620: in VariableManager get_vars()
19110 1726882543.10630: done with get_vars()
19110 1726882543.10687: in VariableManager get_vars()
19110 1726882543.10698: done with get_vars()
19110 1726882543.10770: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback
19110 1726882543.10784: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__
redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug
redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug
19110 1726882543.12474: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py
19110 1726882543.12627: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.debug)
19110 1726882543.12630: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.debug' from /tmp/collections-Xyq/ansible_collections/ansible/posix/plugins/callback/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__)
19110 1726882543.12662: '/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__init__' skipped due to reserved name
19110 1726882543.12689: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py (found_in_cache=True, class_only=False)
19110 1726882543.12855: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py
19110 1726882543.12919: Loaded config def from plugin (callback/default)
19110 1726882543.12921: Loading CallbackModule 'default' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/default.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
19110 1726882543.14039: Loaded config def from plugin (callback/junit)
19110 1726882543.14041: Loading CallbackModule 'junit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/junit.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
19110 1726882543.14084: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py (found_in_cache=True, class_only=False)
19110 1726882543.14139: Loaded config def from plugin (callback/minimal)
19110 1726882543.14141: Loading CallbackModule 'minimal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/minimal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
19110 1726882543.14176: Loading CallbackModule 'oneline' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/oneline.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
19110 1726882543.14231: Loaded config def from plugin (callback/tree)
19110 1726882543.14233: Loading CallbackModule 'tree' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/tree.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
redirecting (type: callback) ansible.builtin.profile_tasks to ansible.posix.profile_tasks
19110 1726882543.14346: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.profile_tasks)
19110 1726882543.14348: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.profile_tasks' from /tmp/collections-Xyq/ansible_collections/ansible/posix/plugins/callback/profile_tasks.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
Skipping callback 'default', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.
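The deprecation warning at the top of this log names its own remedies: switch to the singular ANSIBLE_COLLECTIONS_PATH variable and, optionally, disable the warnings in ansible.cfg. A minimal sketch of both (the collection path is the one reported by this run; file locations are illustrative):

```shell
# Use the singular env var the warning asks for. /tmp/collections-Xyq is
# the "ansible collection location" this log reports; adjust for your system.
export ANSIBLE_COLLECTIONS_PATH=/tmp/collections-Xyq

# Optionally silence deprecation warnings, as the message itself suggests.
# (The run above found no config file, so this creates a fresh ansible.cfg.)
cat > ansible.cfg <<'EOF'
[defaults]
deprecation_warnings = False
EOF

grep 'deprecation_warnings' ansible.cfg
```

Note that silencing the warning does not fix the underlying issue; renaming the variable does.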
PLAYBOOK: tests_ethernet_nm.yml ************************************************
10 plays in /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tests_ethernet_nm.yml
19110 1726882543.14372: in VariableManager get_vars()
19110 1726882543.14383: done with get_vars()
19110 1726882543.14388: in VariableManager get_vars()
19110 1726882543.14394: done with get_vars()
19110 1726882543.14397: variable 'omit' from source: magic vars
19110 1726882543.14427: in VariableManager get_vars()
19110 1726882543.14440: done with get_vars()
19110 1726882543.14459: variable 'omit' from source: magic vars

PLAY [Run playbook 'playbooks/tests_ethernet.yml' with nm as provider] *********
19110 1726882543.14978: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy
19110 1726882543.15047: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py
19110 1726882543.15079: getting the remaining hosts for this loop
19110 1726882543.15081: done getting the remaining hosts for this loop
19110 1726882543.15083: getting the next task for host managed_node1
19110 1726882543.15086: done getting next task for host managed_node1
19110 1726882543.15088: ^ task is: TASK: Gathering Facts
19110 1726882543.15090: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
19110 1726882543.15096: getting variables
19110 1726882543.15097: in VariableManager get_vars()
19110 1726882543.15106: Calling all_inventory to load vars for managed_node1
19110 1726882543.15108: Calling groups_inventory to load vars for managed_node1
19110 1726882543.15111: Calling all_plugins_inventory to load vars for managed_node1
19110 1726882543.15121: Calling all_plugins_play to load vars for managed_node1
19110 1726882543.15132: Calling groups_plugins_inventory to load vars for managed_node1
19110 1726882543.15135: Calling groups_plugins_play to load vars for managed_node1
19110 1726882543.15175: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
19110 1726882543.15226: done with get_vars()
19110 1726882543.15233: done getting variables
19110 1726882543.15303: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True)

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tests_ethernet_nm.yml:6
Friday 20 September 2024 21:35:43 -0400 (0:00:00.010) 0:00:00.010 ******
19110 1726882543.15324: entering _queue_task() for managed_node1/gather_facts
19110 1726882543.15326: Creating lock for gather_facts
19110 1726882543.15630: worker is 1 (out of 1 available)
19110 1726882543.15641: exiting _queue_task() for managed_node1/gather_facts
19110 1726882543.15653: done queuing things up, now waiting for results queue to drain
19110 1726882543.15655: waiting for pending results...
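Earlier in this log, the yaml inventory plugin (after the auto plugin declined /tmp/network-91m/inventory.yml) set ansible_host and ansible_ssh_extra_args for managed_node1 through managed_node3 and left them in group all/ungrouped. An inventory consistent with those entries would look roughly like the following. This is a hypothetical reconstruction: the real addresses and SSH arguments never appear in the log, so the values below are placeholders.

```yaml
# Hypothetical reconstruction of /tmp/network-91m/inventory.yml.
# Host names match the log; addresses and SSH args are placeholders.
all:
  hosts:
    managed_node1:
      ansible_host: 203.0.113.11                            # placeholder
      ansible_ssh_extra_args: -o StrictHostKeyChecking=no   # placeholder
    managed_node2:
      ansible_host: 203.0.113.12                            # placeholder
      ansible_ssh_extra_args: -o StrictHostKeyChecking=no   # placeholder
    managed_node3:
      ansible_host: 203.0.113.13                            # placeholder
      ansible_ssh_extra_args: -o StrictHostKeyChecking=no   # placeholder
```

A top-level `all:` mapping with a `hosts:` key is what the yaml inventory plugin accepts, which matches the "group all already in inventory" and per-host "set ansible_host" lines above.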
19110 1726882543.15882: running TaskExecutor() for managed_node1/TASK: Gathering Facts
19110 1726882543.15968: in run() - task 0e448fcc-3ce9-5372-c19a-00000000007c
19110 1726882543.15986: variable 'ansible_search_path' from source: unknown
19110 1726882543.16026: calling self._execute()
19110 1726882543.16087: variable 'ansible_host' from source: host vars for 'managed_node1'
19110 1726882543.16097: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
19110 1726882543.16111: variable 'omit' from source: magic vars
19110 1726882543.16206: variable 'omit' from source: magic vars
19110 1726882543.16241: variable 'omit' from source: magic vars
19110 1726882543.16280: variable 'omit' from source: magic vars
19110 1726882543.16331: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
19110 1726882543.16371: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
19110 1726882543.16394: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
19110 1726882543.16415: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
19110 1726882543.16433: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
19110 1726882543.16465: variable 'inventory_hostname' from source: host vars for 'managed_node1'
19110 1726882543.16474: variable 'ansible_host' from source: host vars for 'managed_node1'
19110 1726882543.16480: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
19110 1726882543.16580: Set connection var ansible_timeout to 10
19110 1726882543.16597: Set connection var ansible_module_compression to ZIP_DEFLATED
19110 1726882543.16607: Set connection var ansible_shell_executable to /bin/sh
19110 1726882543.16613: Set connection var ansible_shell_type to sh
19110 1726882543.16619: Set connection var ansible_connection to ssh
19110 1726882543.16627: Set connection var ansible_pipelining to False
19110 1726882543.16689: variable 'ansible_shell_executable' from source: unknown
19110 1726882543.16698: variable 'ansible_connection' from source: unknown
19110 1726882543.16705: variable 'ansible_module_compression' from source: unknown
19110 1726882543.16712: variable 'ansible_shell_type' from source: unknown
19110 1726882543.16718: variable 'ansible_shell_executable' from source: unknown
19110 1726882543.16725: variable 'ansible_host' from source: host vars for 'managed_node1'
19110 1726882543.16732: variable 'ansible_pipelining' from source: unknown
19110 1726882543.16738: variable 'ansible_timeout' from source: unknown
19110 1726882543.16745: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
19110 1726882543.16929: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
19110 1726882543.16945: variable 'omit' from source: magic vars
19110 1726882543.16955: starting attempt loop
19110 1726882543.16962: running the handler
19110 1726882543.16984: variable 'ansible_facts' from source: unknown
19110 1726882543.17004: _low_level_execute_command(): starting
19110 1726882543.17015: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
19110 1726882543.17752: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<<
19110 1726882543.17774: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
19110 1726882543.17792: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
19110 1726882543.17811: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
19110 1726882543.17860: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<<
19110 1726882543.17878: stderr chunk (state=3): >>>debug2: match not found <<<
19110 1726882543.17893: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
19110 1726882543.17912: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<<
19110 1726882543.17925: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<<
19110 1726882543.17937: stderr chunk (state=3): >>>debug1: re-parsing configuration <<<
19110 1726882543.17954: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
19110 1726882543.17971: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
19110 1726882543.17988: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
19110 1726882543.18001: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<<
19110 1726882543.18013: stderr chunk (state=3): >>>debug2: match found <<<
19110 1726882543.18028: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
19110 1726882543.18108: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
19110 1726882543.18132: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
19110 1726882543.18150: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
19110 1726882543.18291: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
19110 1726882543.19960: stdout chunk (state=3): >>>/root <<<
19110 1726882543.20067: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
19110 1726882543.20149: stderr chunk (state=3): >>><<<
19110 1726882543.20165: stdout chunk (state=3): >>><<<
19110 1726882543.20289: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
19110 1726882543.20292: _low_level_execute_command(): starting
19110 1726882543.20295: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882543.2019453-19118-85818211078291 `" && echo ansible-tmp-1726882543.2019453-19118-85818211078291="` echo /root/.ansible/tmp/ansible-tmp-1726882543.2019453-19118-85818211078291 `" ) && sleep 0'
19110 1726882543.20891: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<<
19110 1726882543.20912: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
19110 1726882543.20927: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
19110 1726882543.20950: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
19110 1726882543.20995: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<<
19110 1726882543.21006: stderr chunk (state=3): >>>debug2: match not found <<<
19110 1726882543.21019: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
19110 1726882543.21034: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<<
19110 1726882543.21052: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<<
19110 1726882543.21069: stderr chunk (state=3): >>>debug1: re-parsing configuration <<<
19110 1726882543.21081: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
19110 1726882543.21094: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
19110 1726882543.21108: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
19110 1726882543.21119: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<<
19110 1726882543.21129: stderr chunk (state=3): >>>debug2: match found <<<
19110 1726882543.21141: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
19110 1726882543.21227: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
19110 1726882543.21247: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
19110 1726882543.21267: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
19110 1726882543.21399: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
19110 1726882543.23266: stdout chunk (state=3):
>>>ansible-tmp-1726882543.2019453-19118-85818211078291=/root/.ansible/tmp/ansible-tmp-1726882543.2019453-19118-85818211078291 <<< 19110 1726882543.23451: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882543.23458: stdout chunk (state=3): >>><<< 19110 1726882543.23461: stderr chunk (state=3): >>><<< 19110 1726882543.23673: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882543.2019453-19118-85818211078291=/root/.ansible/tmp/ansible-tmp-1726882543.2019453-19118-85818211078291 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 1726882543.23676: variable 'ansible_module_compression' from source: unknown 19110 1726882543.23679: ANSIBALLZ: Using generic lock for ansible.legacy.setup 19110 1726882543.23681: ANSIBALLZ: Acquiring lock 19110 1726882543.23683: ANSIBALLZ: Lock acquired: 139855634067296 19110 1726882543.23686: ANSIBALLZ: Creating 
module 19110 1726882543.56779: ANSIBALLZ: Writing module into payload 19110 1726882543.56959: ANSIBALLZ: Writing module 19110 1726882543.56986: ANSIBALLZ: Renaming module 19110 1726882543.56990: ANSIBALLZ: Done creating module 19110 1726882543.57024: variable 'ansible_facts' from source: unknown 19110 1726882543.57030: variable 'inventory_hostname' from source: host vars for 'managed_node1' 19110 1726882543.57048: _low_level_execute_command(): starting 19110 1726882543.57052: _low_level_execute_command(): executing: /bin/sh -c 'echo PLATFORM; uname; echo FOUND; command -v '"'"'python3.12'"'"'; command -v '"'"'python3.11'"'"'; command -v '"'"'python3.10'"'"'; command -v '"'"'python3.9'"'"'; command -v '"'"'python3.8'"'"'; command -v '"'"'python3.7'"'"'; command -v '"'"'/usr/bin/python3'"'"'; command -v '"'"'python3'"'"'; echo ENDFOUND && sleep 0' 19110 1726882543.57712: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882543.57734: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882543.57737: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882543.57744: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882543.57789: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882543.57792: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882543.57795: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882543.57818: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882543.57821: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882543.57824: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882543.57826: stderr chunk 
(state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882543.57838: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882543.57848: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882543.57859: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882543.57861: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882543.57877: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882543.57961: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882543.57971: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882543.57975: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882543.58118: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882543.59807: stdout chunk (state=3): >>>PLATFORM <<< 19110 1726882543.59889: stdout chunk (state=3): >>>Linux <<< 19110 1726882543.59903: stdout chunk (state=3): >>>FOUND <<< 19110 1726882543.59916: stdout chunk (state=3): >>>/usr/bin/python3.9 <<< 19110 1726882543.59919: stdout chunk (state=3): >>>/usr/bin/python3 <<< 19110 1726882543.59920: stdout chunk (state=3): >>>/usr/bin/python3 ENDFOUND <<< 19110 1726882543.60073: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882543.60105: stderr chunk (state=3): >>><<< 19110 1726882543.60108: stdout chunk (state=3): >>><<< 19110 1726882543.60139: _low_level_execute_command() done: rc=0, stdout=PLATFORM Linux FOUND /usr/bin/python3.9 /usr/bin/python3 /usr/bin/python3 ENDFOUND , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 1726882543.60149 [managed_node1]: found interpreters: ['/usr/bin/python3.9', '/usr/bin/python3', '/usr/bin/python3'] 19110 1726882543.60196: _low_level_execute_command(): starting 19110 1726882543.60199: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 && sleep 0' 19110 1726882543.60583: Sending initial data 19110 1726882543.60588: Sent initial data (1181 bytes) 19110 1726882543.60976: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882543.60990: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882543.61003: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882543.61020: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882543.61069: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882543.61085: stderr chunk (state=3): 
>>>debug2: match not found <<< 19110 1726882543.61102: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882543.61121: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882543.61139: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882543.61151: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882543.61169: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882543.61186: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882543.61208: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882543.61222: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882543.61234: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882543.61260: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882543.61344: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882543.61374: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882543.61390: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882543.61512: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882543.65256: stdout chunk (state=3): >>>{"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"9\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"9\"\nPLATFORM_ID=\"platform:el9\"\nPRETTY_NAME=\"CentOS Stream 
9\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:9\"\nHOME_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 9\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} <<< 19110 1726882543.65681: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882543.65714: stderr chunk (state=3): >>><<< 19110 1726882543.66071: stdout chunk (state=3): >>><<< 19110 1726882543.66075: _low_level_execute_command() done: rc=0, stdout={"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"9\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"9\"\nPLATFORM_ID=\"platform:el9\"\nPRETTY_NAME=\"CentOS Stream 9\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:9\"\nHOME_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 9\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing 
master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 1726882543.66078: variable 'ansible_facts' from source: unknown 19110 1726882543.66080: variable 'ansible_facts' from source: unknown 19110 1726882543.66082: variable 'ansible_module_compression' from source: unknown 19110 1726882543.66083: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-191108pnkimox/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 19110 1726882543.66085: variable 'ansible_facts' from source: unknown 19110 1726882543.66113: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882543.2019453-19118-85818211078291/AnsiballZ_setup.py 19110 1726882543.66289: Sending initial data 19110 1726882543.66292: Sent initial data (153 bytes) 19110 1726882543.67275: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882543.67292: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882543.67308: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882543.67326: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882543.67369: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882543.67383: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882543.67401: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882543.67418: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882543.67429: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882543.67438: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 
1726882543.67448: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882543.67467: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882543.67484: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882543.67497: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882543.67511: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882543.67524: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882543.67643: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882543.67671: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882543.67688: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882543.67801: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882543.69534: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 19110 1726882543.69623: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 19110 1726882543.69735: stdout chunk (state=3): >>>sftp> put 
/root/.ansible/tmp/ansible-local-191108pnkimox/tmpbdpfm0ip /root/.ansible/tmp/ansible-tmp-1726882543.2019453-19118-85818211078291/AnsiballZ_setup.py <<< 19110 1726882543.69810: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 19110 1726882543.72353: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882543.72501: stderr chunk (state=3): >>><<< 19110 1726882543.72507: stdout chunk (state=3): >>><<< 19110 1726882543.72510: done transferring module to remote 19110 1726882543.72512: _low_level_execute_command(): starting 19110 1726882543.72514: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882543.2019453-19118-85818211078291/ /root/.ansible/tmp/ansible-tmp-1726882543.2019453-19118-85818211078291/AnsiballZ_setup.py && sleep 0' 19110 1726882543.72868: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882543.72876: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882543.72884: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882543.72893: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882543.72920: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882543.72927: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882543.72934: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882543.72945: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882543.72957: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882543.72960: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882543.72967: stderr chunk 
(state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882543.72977: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882543.72986: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882543.72994: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882543.72999: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882543.73003: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882543.73051: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882543.73075: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882543.73185: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882543.74922: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882543.74969: stderr chunk (state=3): >>><<< 19110 1726882543.74972: stdout chunk (state=3): >>><<< 19110 1726882543.74979: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 1726882543.74981: _low_level_execute_command(): starting 19110 1726882543.74986: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882543.2019453-19118-85818211078291/AnsiballZ_setup.py && sleep 0' 19110 1726882543.75374: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882543.75380: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882543.75408: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882543.75418: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882543.75423: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882543.75432: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882543.75439: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882543.75444: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882543.75452: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882543.75465: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 
19110 1726882543.75478: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882543.75519: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882543.75532: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882543.75540: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882543.75654: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882543.77585: stdout chunk (state=3): >>>import _frozen_importlib # frozen import _imp # builtin <<< 19110 1726882543.77611: stdout chunk (state=3): >>>import '_thread' # import '_warnings' # <<< 19110 1726882543.77616: stdout chunk (state=3): >>>import '_weakref' # <<< 19110 1726882543.77679: stdout chunk (state=3): >>>import '_io' # <<< 19110 1726882543.77687: stdout chunk (state=3): >>>import 'marshal' # <<< 19110 1726882543.77712: stdout chunk (state=3): >>>import 'posix' # <<< 19110 1726882543.77741: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook <<< 19110 1726882543.77791: stdout chunk (state=3): >>>import 'time' # import 'zipimport' # <<< 19110 1726882543.77795: stdout chunk (state=3): >>> # installed zipimport hook <<< 19110 1726882543.77846: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' <<< 19110 1726882543.77879: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py <<< 19110 1726882543.77891: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' <<< 19110 1726882543.77900: 
stdout chunk (state=3): >>>import '_codecs' # <<< 19110 1726882543.77919: stdout chunk (state=3): >>>import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf091f3dc0> <<< 19110 1726882543.77969: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' <<< 19110 1726882543.77979: stdout chunk (state=3): >>>import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf091983a0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf091f3b20> <<< 19110 1726882543.78002: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' <<< 19110 1726882543.78021: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf091f3ac0> <<< 19110 1726882543.78036: stdout chunk (state=3): >>>import '_signal' # <<< 19110 1726882543.78063: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' <<< 19110 1726882543.78089: stdout chunk (state=3): >>>import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf09198490> <<< 19110 1726882543.78102: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py # code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' <<< 19110 1726882543.78122: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py <<< 19110 1726882543.78130: stdout chunk 
(state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' <<< 19110 1726882543.78141: stdout chunk (state=3): >>>import '_abc' # <<< 19110 1726882543.78152: stdout chunk (state=3): >>>import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf09198940> <<< 19110 1726882543.78170: stdout chunk (state=3): >>>import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf09198670> <<< 19110 1726882543.78204: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py <<< 19110 1726882543.78209: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' <<< 19110 1726882543.78230: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py <<< 19110 1726882543.78244: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' <<< 19110 1726882543.78269: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py <<< 19110 1726882543.78284: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' <<< 19110 1726882543.78314: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf0914f190> <<< 19110 1726882543.78321: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py <<< 19110 1726882543.78346: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' <<< 19110 1726882543.78419: stdout chunk (state=3): >>>import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf0914f220> <<< 19110 1726882543.78442: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py <<< 19110 1726882543.78446: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' <<< 19110 1726882543.78475: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py # code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' <<< 19110 1726882543.78482: stdout chunk (state=3): >>>import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf09172850> import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf0914f940> <<< 19110 1726882543.78521: stdout chunk (state=3): >>>import 'os' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf091b0880> <<< 19110 1726882543.78538: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py # code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' <<< 19110 1726882543.78543: stdout chunk (state=3): >>>import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf09148d90> <<< 19110 1726882543.78605: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py # code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' <<< 19110 1726882543.78608: stdout chunk (state=3): >>>import '_locale' # import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf09172d90> <<< 19110 1726882543.78666: stdout chunk (state=3): >>>import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf09198970> <<< 19110 1726882543.78685: stdout chunk (state=3): >>>Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or 
"license" for more information. <<< 19110 1726882543.79013: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py <<< 19110 1726882543.79017: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' <<< 19110 1726882543.79049: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py <<< 19110 1726882543.79054: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' <<< 19110 1726882543.79081: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py <<< 19110 1726882543.79090: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' <<< 19110 1726882543.79110: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py <<< 19110 1726882543.79122: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' <<< 19110 1726882543.79137: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf090eeeb0> <<< 19110 1726882543.79179: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf090f1f40> <<< 19110 1726882543.79200: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py <<< 19110 1726882543.79208: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' <<< 19110 1726882543.79225: stdout chunk (state=3): >>>import '_sre' # <<< 19110 1726882543.79242: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py <<< 19110 1726882543.79261: stdout chunk 
(state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' <<< 19110 1726882543.79284: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py <<< 19110 1726882543.79293: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' <<< 19110 1726882543.79301: stdout chunk (state=3): >>>import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf090e7610> <<< 19110 1726882543.79323: stdout chunk (state=3): >>>import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf090ed640> <<< 19110 1726882543.79330: stdout chunk (state=3): >>>import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf090ee370> <<< 19110 1726882543.79351: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py <<< 19110 1726882543.79416: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' <<< 19110 1726882543.79439: stdout chunk (state=3): >>># /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/collections/__init__.py <<< 19110 1726882543.79468: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' <<< 19110 1726882543.79489: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py # code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' <<< 19110 1726882543.79526: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7fdf08d91dc0> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf08d918b0> <<< 19110 1726882543.79539: stdout chunk (state=3): >>>import 'itertools' # <<< 19110 1726882543.79566: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py <<< 19110 1726882543.79578: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf08d91eb0> <<< 19110 1726882543.79581: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches /usr/lib64/python3.9/operator.py <<< 19110 1726882543.79601: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' <<< 19110 1726882543.79621: stdout chunk (state=3): >>>import '_operator' # <<< 19110 1726882543.79626: stdout chunk (state=3): >>>import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf08d91f70> <<< 19110 1726882543.79649: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py <<< 19110 1726882543.79652: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' <<< 19110 1726882543.79655: stdout chunk (state=3): >>>import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf08d91e80> <<< 19110 1726882543.79674: stdout chunk (state=3): >>>import '_collections' # <<< 19110 1726882543.79720: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf090c9d30> <<< 19110 1726882543.79723: stdout chunk (state=3): >>>import '_functools' # <<< 19110 1726882543.79747: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 
0x7fdf090c2610> <<< 19110 1726882543.79803: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py <<< 19110 1726882543.79810: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf090d5670> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf090f5e20> <<< 19110 1726882543.79831: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' <<< 19110 1726882543.79861: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' <<< 19110 1726882543.79868: stdout chunk (state=3): >>># extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdf08da3c70> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf090c9250> <<< 19110 1726882543.79910: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' <<< 19110 1726882543.79913: stdout chunk (state=3): >>># extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdf090d5280> <<< 19110 1726882543.79916: stdout chunk (state=3): >>>import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf090fb9d0> <<< 19110 1726882543.79940: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py # code object from 
'/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' <<< 19110 1726882543.79965: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py <<< 19110 1726882543.79970: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' <<< 19110 1726882543.79994: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py <<< 19110 1726882543.79997: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' <<< 19110 1726882543.80019: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf08da3fa0> <<< 19110 1726882543.80021: stdout chunk (state=3): >>>import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf08da3d90> <<< 19110 1726882543.80046: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches /usr/lib64/python3.9/importlib/machinery.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf08da3d00> <<< 19110 1726882543.80067: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py <<< 19110 1726882543.80104: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py <<< 19110 1726882543.80110: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' <<< 19110 1726882543.80130: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches 
/usr/lib64/python3.9/typing.py <<< 19110 1726882543.80182: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' <<< 19110 1726882543.80205: stdout chunk (state=3): >>># /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py # code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' <<< 19110 1726882543.80218: stdout chunk (state=3): >>>import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf08d76370> <<< 19110 1726882543.80221: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches /usr/lib64/python3.9/contextlib.py <<< 19110 1726882543.80241: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' <<< 19110 1726882543.80270: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf08d76460> <<< 19110 1726882543.80392: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf08daafa0> <<< 19110 1726882543.80422: stdout chunk (state=3): >>>import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf08da5a30> <<< 19110 1726882543.80439: stdout chunk (state=3): >>>import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf08da5490> <<< 19110 1726882543.80458: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py <<< 19110 1726882543.80475: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' <<< 19110 1726882543.80497: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py <<< 19110 1726882543.80517: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' <<< 19110 1726882543.80532: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py <<< 19110 1726882543.80549: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf08cc41c0> <<< 19110 1726882543.80589: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf08d61c70> <<< 19110 1726882543.80629: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf08da5eb0> <<< 19110 1726882543.80636: stdout chunk (state=3): >>>import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf090fb040> <<< 19110 1726882543.80660: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py <<< 19110 1726882543.80686: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' <<< 19110 1726882543.80704: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py <<< 19110 1726882543.80712: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf08cd6af0> <<< 19110 1726882543.80722: stdout chunk (state=3): >>>import 'errno' # <<< 19110 1726882543.80755: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' <<< 19110 1726882543.80778: stdout chunk (state=3): >>># extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7fdf08cd6e20> <<< 19110 1726882543.80781: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py <<< 19110 1726882543.80783: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' <<< 19110 1726882543.80809: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py <<< 19110 1726882543.80814: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf08ce8730> <<< 19110 1726882543.80837: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py <<< 19110 1726882543.80867: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' <<< 19110 1726882543.80896: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf08ce8c70> <<< 19110 1726882543.80932: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' <<< 19110 1726882543.80935: stdout chunk (state=3): >>># extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdf08c803a0> <<< 19110 1726882543.80939: stdout chunk (state=3): >>>import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf08cd6f10> <<< 19110 1726882543.80962: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py <<< 19110 1726882543.80967: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' <<< 19110 1726882543.81012: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdf08c91280> <<< 19110 1726882543.81026: stdout chunk (state=3): >>>import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf08ce85b0> <<< 19110 1726882543.81031: stdout chunk (state=3): >>>import 'pwd' # <<< 19110 1726882543.81054: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' <<< 19110 1726882543.81060: stdout chunk (state=3): >>># extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdf08c91340> <<< 19110 1726882543.81095: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf08da39d0> <<< 19110 1726882543.81112: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py <<< 19110 1726882543.81132: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' <<< 19110 1726882543.81148: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py <<< 19110 1726882543.81170: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' <<< 19110 1726882543.81192: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' <<< 19110 1726882543.81201: stdout chunk (state=3): >>># extension 
module 'math' executed from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdf08cac6a0> <<< 19110 1726882543.81221: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' <<< 19110 1726882543.81248: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdf08cac970> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf08cac760> <<< 19110 1726882543.81271: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdf08cac850> <<< 19110 1726882543.81298: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc matches /usr/lib64/python3.9/hashlib.py # code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' <<< 19110 1726882543.81489: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdf08cacca0> <<< 19110 1726882543.81521: stdout chunk (state=3): >>># extension module '_blake2' loaded from 
'/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' <<< 19110 1726882543.81527: stdout chunk (state=3): >>># extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdf08cb91f0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf08cac8e0> <<< 19110 1726882543.81548: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf08ca0a30> <<< 19110 1726882543.81570: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf08da35b0> <<< 19110 1726882543.81593: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py <<< 19110 1726882543.81646: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' <<< 19110 1726882543.81686: stdout chunk (state=3): >>>import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf08caca90> <<< 19110 1726882543.81819: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/cp437.pyc' <<< 19110 1726882543.81837: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fdf08bdb670> <<< 19110 1726882543.82109: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available <<< 19110 1726882543.82198: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882543.82227: stdout chunk (state=3): >>>import ansible # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/__init__.py <<< 19110 1726882543.82234: stdout chunk (state=3): >>># zipimport: zlib available 
<<< 19110 1726882543.82250: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882543.82262: stdout chunk (state=3): >>>import ansible.module_utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/__init__.py <<< 19110 1726882543.82279: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882543.83458: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882543.84371: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py <<< 19110 1726882543.84376: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf085447f0> <<< 19110 1726882543.84396: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py <<< 19110 1726882543.84401: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' <<< 19110 1726882543.84422: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py <<< 19110 1726882543.84427: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' <<< 19110 1726882543.84446: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py # code object from '/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' <<< 19110 1726882543.84478: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import 
'_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdf085d4760> <<< 19110 1726882543.84512: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf085d4640> <<< 19110 1726882543.84542: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf085d4370> <<< 19110 1726882543.84562: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' <<< 19110 1726882543.84609: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf085d4490> <<< 19110 1726882543.84613: stdout chunk (state=3): >>>import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf085d4190> <<< 19110 1726882543.84619: stdout chunk (state=3): >>>import 'atexit' # <<< 19110 1726882543.84647: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' <<< 19110 1726882543.84660: stdout chunk (state=3): >>># extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdf085d4400> <<< 19110 1726882543.84667: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py <<< 19110 1726882543.84693: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' <<< 19110 1726882543.84728: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf085d47c0> <<< 19110 1726882543.84749: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py 
<<< 19110 1726882543.84767: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' <<< 19110 1726882543.84790: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py <<< 19110 1726882543.84800: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' <<< 19110 1726882543.84825: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py # code object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' <<< 19110 1726882543.84904: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf085b07c0> <<< 19110 1726882543.84938: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdf085b0b50> <<< 19110 1726882543.84969: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdf085b09a0> <<< 19110 1726882543.84995: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py <<< 19110 1726882543.84997: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' <<< 19110 1726882543.85034: stdout chunk (state=3): >>>import 'selectors' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fdf084c74f0> <<< 19110 1726882543.85045: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf085ced30> <<< 19110 1726882543.85215: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf085d4520> <<< 19110 1726882543.85229: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py <<< 19110 1726882543.85234: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' <<< 19110 1726882543.85248: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf085ce190> <<< 19110 1726882543.85274: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py <<< 19110 1726882543.85282: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' <<< 19110 1726882543.85316: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py # code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' <<< 19110 1726882543.85326: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches /usr/lib64/python3.9/tokenize.py <<< 19110 1726882543.85344: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' <<< 19110 1726882543.85361: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py <<< 19110 1726882543.85371: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf085ffa90> <<< 19110 1726882543.85445: 
stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf085a2190> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf085a2790> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf084cdd00> <<< 19110 1726882543.85475: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdf085a26a0> <<< 19110 1726882543.85507: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' <<< 19110 1726882543.85510: stdout chunk (state=3): >>>import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf08b4ed30> <<< 19110 1726882543.85533: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py <<< 19110 1726882543.85540: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' <<< 19110 1726882543.85565: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py <<< 19110 1726882543.85593: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' <<< 19110 1726882543.85658: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' # extension module '_datetime' executed from 
'/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdf085259a0> <<< 19110 1726882543.85667: stdout chunk (state=3): >>>import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf08b58e50> <<< 19110 1726882543.85681: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py <<< 19110 1726882543.85698: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' <<< 19110 1726882543.85747: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' <<< 19110 1726882543.85750: stdout chunk (state=3): >>># extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdf085350d0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf08b58e20> <<< 19110 1726882543.85778: stdout chunk (state=3): >>># /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py <<< 19110 1726882543.85807: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' <<< 19110 1726882543.85838: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py # code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' <<< 19110 1726882543.85848: stdout chunk (state=3): >>>import '_string' # <<< 19110 1726882543.85899: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf08b60220> <<< 19110 1726882543.86029: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 
0x7fdf08535100> <<< 19110 1726882543.86117: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' <<< 19110 1726882543.86122: stdout chunk (state=3): >>># extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdf085f9b80> <<< 19110 1726882543.86149: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' <<< 19110 1726882543.86155: stdout chunk (state=3): >>># extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdf08b58ac0> <<< 19110 1726882543.86185: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' <<< 19110 1726882543.86198: stdout chunk (state=3): >>># extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdf08b58d00> <<< 19110 1726882543.86200: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf08bdb820> <<< 19110 1726882543.86231: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' <<< 19110 1726882543.86245: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py <<< 19110 1726882543.86263: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' <<< 19110 1726882543.86307: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdf085310d0> <<< 19110 1726882543.86483: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdf08527370> <<< 19110 1726882543.86493: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf08531d00> <<< 19110 1726882543.86522: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' <<< 19110 1726882543.86534: stdout chunk (state=3): >>># extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdf085316a0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf08532130> <<< 19110 1726882543.86541: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882543.86557: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882543.86568: stdout chunk (state=3): >>>import ansible.module_utils.compat # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/__init__.py <<< 19110 1726882543.86576: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882543.86648: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882543.86726: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882543.86729: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882543.86744: stdout chunk (state=3): >>>import ansible.module_utils.common # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/__init__.py <<< 19110 1726882543.86753: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882543.86771: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882543.86775: stdout chunk (state=3): >>>import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/__init__.py <<< 19110 1726882543.86779: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882543.86878: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882543.86969: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882543.87411: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882543.87860: stdout chunk (state=3): >>>import ansible.module_utils.six # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/six/__init__.py <<< 19110 1726882543.87877: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # <<< 19110 1726882543.87883: stdout chunk (state=3): >>>import ansible.module_utils.common.text.converters # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/converters.py <<< 19110 1726882543.87900: stdout chunk (state=3): >>># /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py <<< 19110 1726882543.87915: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' <<< 19110 1726882543.87967: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdf0856d8b0> <<< 19110 1726882543.88033: stdout chunk (state=3): >>># /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py <<< 19110 1726882543.88037: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' <<< 19110 1726882543.88047: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf08572910> <<< 19110 1726882543.88053: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf080c56a0> <<< 19110 1726882543.88096: stdout chunk (state=3): >>>import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/selinux.py <<< 19110 1726882543.88107: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882543.88127: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882543.88144: stdout chunk (state=3): >>>import ansible.module_utils._text # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/_text.py <<< 19110 1726882543.88149: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882543.88271: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882543.88394: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py <<< 19110 1726882543.88399: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' <<< 19110 1726882543.88422: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf085ac7f0> <<< 19110 1726882543.88433: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882543.88816: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882543.89183: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882543.89232: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882543.89300: stdout chunk (state=3): >>>import ansible.module_utils.common.collections # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/collections.py <<< 19110 1726882543.89305: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882543.89339: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882543.89373: stdout chunk (state=3): >>>import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/warnings.py <<< 19110 1726882543.89379: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882543.89430: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882543.89506: stdout chunk (state=3): >>>import ansible.module_utils.errors # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/errors.py <<< 19110 1726882543.89519: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882543.89523: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882543.89537: stdout chunk (state=3): >>>import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/parsing/__init__.py <<< 19110 1726882543.89543: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882543.89574: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882543.89612: stdout chunk (state=3): >>>import ansible.module_utils.parsing.convert_bool # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/parsing/convert_bool.py <<< 19110 1726882543.89617: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882543.89804: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882543.89990: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py <<< 19110 1726882543.90019: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' <<< 19110 1726882543.90027: stdout chunk (state=3): >>>import '_ast' # <<< 19110 1726882543.90098: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf080c9d90> # zipimport: zlib available <<< 19110 1726882543.90165: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882543.90224: stdout chunk (state=3): >>>import ansible.module_utils.common.text.formatters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/formatters.py <<< 19110 
1726882543.90237: stdout chunk (state=3): >>>import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/validation.py import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/parameters.py <<< 19110 1726882543.90246: stdout chunk (state=3): >>>import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/arg_spec.py <<< 19110 1726882543.90260: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882543.90299: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882543.90329: stdout chunk (state=3): >>>import ansible.module_utils.common.locale # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/locale.py <<< 19110 1726882543.90338: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882543.90375: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882543.90416: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882543.90503: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882543.90567: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py <<< 19110 1726882543.90586: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' <<< 19110 1726882543.90655: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' <<< 19110 
1726882543.90662: stdout chunk (state=3): >>># extension module 'selinux._selinux' executed from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdf085600a0> <<< 19110 1726882543.90744: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf08090070> <<< 19110 1726882543.90779: stdout chunk (state=3): >>>import ansible.module_utils.common.file # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/file.py <<< 19110 1726882543.90787: stdout chunk (state=3): >>>import ansible.module_utils.common.process # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/process.py <<< 19110 1726882543.90789: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882543.90841: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882543.90893: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882543.90919: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882543.90951: stdout chunk (state=3): >>># /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py <<< 19110 1726882543.90971: stdout chunk (state=3): >>># code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' <<< 19110 1726882543.90988: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py <<< 19110 1726882543.91021: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' <<< 19110 1726882543.91039: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches 
/usr/lib64/python3.9/gettext.py <<< 19110 1726882543.91065: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' <<< 19110 1726882543.91140: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf08569160> <<< 19110 1726882543.91179: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf08565cd0> <<< 19110 1726882543.91235: stdout chunk (state=3): >>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf080c9bb0> # destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/distro/__init__.py <<< 19110 1726882543.91241: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882543.91272: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882543.91292: stdout chunk (state=3): >>>import ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/_utils.py <<< 19110 1726882543.91297: stdout chunk (state=3): >>>import ansible.module_utils.common.sys_info # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/sys_info.py <<< 19110 1726882543.91369: stdout chunk (state=3): >>>import ansible.module_utils.basic # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/basic.py <<< 19110 1726882543.91387: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882543.91390: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882543.91400: stdout chunk (state=3): >>>import ansible.modules # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/modules/__init__.py <<< 19110 1726882543.91403: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882543.91465: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882543.91512: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882543.91533: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882543.91550: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882543.91587: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882543.91621: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882543.91655: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882543.91685: stdout chunk (state=3): >>>import ansible.module_utils.facts.namespace # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/namespace.py <<< 19110 1726882543.91695: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882543.91760: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882543.91824: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882543.91839: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882543.91875: stdout chunk (state=3): >>>import ansible.module_utils.compat.typing # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/typing.py <<< 19110 1726882543.91882: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882543.92023: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882543.92162: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882543.92197: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882543.92257: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/__init__.py <<< 19110 1726882543.92261: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc' <<< 19110 1726882543.92275: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/context.py <<< 19110 1726882543.92285: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc' <<< 19110 1726882543.92294: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/process.py <<< 19110 1726882543.92311: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc' <<< 19110 1726882543.92333: stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf07e44a60> <<< 19110 1726882543.92361: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/reduction.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc' <<< 19110 1726882543.92383: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc matches /usr/lib64/python3.9/pickle.py <<< 19110 1726882543.92408: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc' <<< 19110 1726882543.92446: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc matches /usr/lib64/python3.9/_compat_pickle.py <<< 19110 1726882543.92450: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc' <<< 19110 
1726882543.92453: stdout chunk (state=3): >>>import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf080a46d0> <<< 19110 1726882543.92490: stdout chunk (state=3): >>># extension module '_pickle' loaded from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' <<< 19110 1726882543.92495: stdout chunk (state=3): >>># extension module '_pickle' executed from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdf080a4af0> <<< 19110 1726882543.92551: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf08089250> <<< 19110 1726882543.92570: stdout chunk (state=3): >>>import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf08089a30> <<< 19110 1726882543.92593: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf080d8460> <<< 19110 1726882543.92603: stdout chunk (state=3): >>>import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf080d8910> <<< 19110 1726882543.92618: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/pool.py <<< 19110 1726882543.92641: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc' <<< 19110 1726882543.92665: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc matches /usr/lib64/python3.9/queue.py <<< 19110 1726882543.92671: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc' <<< 19110 1726882543.92698: stdout chunk (state=3): >>># extension module '_queue' loaded from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' <<< 19110 1726882543.92704: 
stdout chunk (state=3): >>># extension module '_queue' executed from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdf080d6d00> <<< 19110 1726882543.92714: stdout chunk (state=3): >>>import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf080d6d60> <<< 19110 1726882543.92739: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/util.py <<< 19110 1726882543.92746: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc' <<< 19110 1726882543.92780: stdout chunk (state=3): >>>import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf080d6250> <<< 19110 1726882543.92787: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/connection.py <<< 19110 1726882543.92811: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc' <<< 19110 1726882543.92836: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' <<< 19110 1726882543.92841: stdout chunk (state=3): >>># extension module '_multiprocessing' executed from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdf07eacf70> <<< 19110 1726882543.92867: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf080e6b50> <<< 19110 1726882543.92892: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 
0x7fdf080d8310> <<< 19110 1726882543.92904: stdout chunk (state=3): >>>import ansible.module_utils.facts.timeout # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/timeout.py <<< 19110 1726882543.92913: stdout chunk (state=3): >>>import ansible.module_utils.facts.collector # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/collector.py <<< 19110 1726882543.92927: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882543.92944: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.facts.other # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/__init__.py <<< 19110 1726882543.92952: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882543.92999: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882543.93055: stdout chunk (state=3): >>>import ansible.module_utils.facts.other.facter # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/facter.py <<< 19110 1726882543.93067: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882543.93101: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882543.93144: stdout chunk (state=3): >>>import ansible.module_utils.facts.other.ohai # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/ohai.py <<< 19110 1726882543.93156: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882543.93162: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882543.93180: stdout chunk (state=3): >>>import ansible.module_utils.facts.system # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/__init__.py <<< 19110 1726882543.93185: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882543.93213: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882543.93235: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.apparmor # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/apparmor.py <<< 19110 1726882543.93249: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882543.93287: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882543.93332: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.caps # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/caps.py <<< 19110 1726882543.93336: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882543.93375: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882543.93409: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.chroot # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/chroot.py <<< 19110 1726882543.93422: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882543.93472: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882543.93525: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882543.93567: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882543.93624: stdout chunk (state=3): >>>import ansible.module_utils.facts.utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/utils.py import 
ansible.module_utils.facts.system.cmdline # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/cmdline.py <<< 19110 1726882543.93633: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882543.94017: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882543.94381: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.distribution # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/distribution.py <<< 19110 1726882543.94390: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882543.94428: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882543.94492: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882543.94511: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882543.94540: stdout chunk (state=3): >>>import ansible.module_utils.compat.datetime # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/datetime.py import ansible.module_utils.facts.system.date_time # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/date_time.py <<< 19110 1726882543.94550: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882543.94581: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882543.94600: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.env # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/env.py <<< 19110 1726882543.94618: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882543.94658: stdout chunk (state=3): >>># zipimport: zlib 
available <<< 19110 1726882543.94709: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.dns # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/dns.py <<< 19110 1726882543.94713: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882543.94742: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882543.94767: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.fips # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/fips.py <<< 19110 1726882543.94780: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882543.94802: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882543.94832: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.loadavg # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/loadavg.py <<< 19110 1726882543.94839: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882543.94905: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882543.94977: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc matches /usr/lib64/python3.9/glob.py <<< 19110 1726882543.94981: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc' <<< 19110 1726882543.95003: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf07dc4ca0> <<< 19110 1726882543.95027: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc matches /usr/lib64/python3.9/configparser.py <<< 19110 1726882543.95041: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc' <<< 19110 
1726882543.95196: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf07dc4fd0> import ansible.module_utils.facts.system.local # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/local.py <<< 19110 1726882543.95211: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882543.95268: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882543.95321: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.lsb # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/lsb.py <<< 19110 1726882543.95330: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882543.95403: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882543.95486: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.pkg_mgr # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/pkg_mgr.py <<< 19110 1726882543.95489: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882543.95538: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882543.95609: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.platform # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/platform.py <<< 19110 1726882543.95614: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882543.95651: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882543.95691: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc matches /usr/lib64/python3.9/ssl.py <<< 19110 1726882543.95716: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc' <<< 19110 1726882543.95871: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdf07daa370> <<< 19110 1726882543.96115: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf07e07bb0> import ansible.module_utils.facts.system.python # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/python.py <<< 19110 1726882543.96119: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882543.96163: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882543.96220: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.selinux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/selinux.py <<< 19110 1726882543.96223: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882543.96284: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882543.96359: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882543.96448: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882543.96596: stdout chunk (state=3): >>>import ansible.module_utils.compat.version # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/version.py import ansible.module_utils.facts.system.service_mgr # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/service_mgr.py <<< 19110 
1726882543.96599: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882543.96628: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882543.96679: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.ssh_pub_keys # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/ssh_pub_keys.py # zipimport: zlib available <<< 19110 1726882543.96704: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882543.96755: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc matches /usr/lib64/python3.9/getpass.py # code object from '/usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc' <<< 19110 1726882543.96805: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' <<< 19110 1726882543.96809: stdout chunk (state=3): >>># extension module 'termios' executed from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdf07d3e160> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf07d3e2b0> <<< 19110 1726882543.96826: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.user # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/user.py # zipimport: zlib available <<< 19110 1726882543.96829: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.facts.hardware # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/__init__.py <<< 19110 1726882543.96851: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882543.96889: stdout chunk (state=3): >>># 
zipimport: zlib available <<< 19110 1726882543.96927: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/base.py <<< 19110 1726882543.96932: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882543.97060: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882543.97188: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.aix # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/aix.py <<< 19110 1726882543.97191: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882543.97271: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882543.97352: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882543.97383: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882543.97428: stdout chunk (state=3): >>>import ansible.module_utils.facts.sysctl # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/sysctl.py <<< 19110 1726882543.97431: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.darwin # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/darwin.py <<< 19110 1726882543.97437: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882543.97512: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882543.97535: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882543.97643: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882543.97763: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.freebsd # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/freebsd.py import ansible.module_utils.facts.hardware.dragonfly # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/dragonfly.py <<< 19110 1726882543.97781: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882543.97879: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882543.97979: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/hpux.py <<< 19110 1726882543.97984: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882543.98014: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882543.98040: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882543.98476: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882543.99313: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/linux.py import ansible.module_utils.facts.hardware.hurd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/hurd.py <<< 19110 1726882543.99328: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/netbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.openbsd # loaded 
from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/openbsd.py # zipimport: zlib available <<< 19110 1726882543.99377: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882543.99535: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/sunos.py <<< 19110 1726882543.99538: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882543.99540: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882543.99542: stdout chunk (state=3): >>>import ansible.module_utils.facts.network # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/__init__.py <<< 19110 1726882543.99544: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882543.99582: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882543.99623: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/base.py <<< 19110 1726882543.99626: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882543.99705: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882543.99789: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882543.99959: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.00137: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.generic_bsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/generic_bsd.py import 
ansible.module_utils.facts.network.aix # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/aix.py <<< 19110 1726882544.00141: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.00174: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.00211: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.darwin # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/darwin.py <<< 19110 1726882544.00214: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.00225: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.00260: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.dragonfly # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/dragonfly.py # zipimport: zlib available <<< 19110 1726882544.00325: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.00381: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.fc_wwn # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/fc_wwn.py <<< 19110 1726882544.00396: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.00407: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.00434: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/freebsd.py # zipimport: zlib available <<< 19110 1726882544.00488: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.00536: stdout chunk 
(state=3): >>>import ansible.module_utils.facts.network.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/hpux.py <<< 19110 1726882544.00548: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.00591: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.00649: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.hurd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/hurd.py <<< 19110 1726882544.00652: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.00868: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.01087: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/linux.py <<< 19110 1726882544.01090: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.01131: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.01190: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.iscsi # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/iscsi.py <<< 19110 1726882544.01193: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.01226: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.01246: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.nvme # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/nvme.py <<< 19110 1726882544.01261: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 
1726882544.01287: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.01327: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/netbsd.py <<< 19110 1726882544.01340: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.01357: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.01386: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/openbsd.py <<< 19110 1726882544.01398: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.01453: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.01542: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/sunos.py # zipimport: zlib available <<< 19110 1726882544.01569: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.facts.virtual # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/__init__.py <<< 19110 1726882544.01583: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.01605: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.01662: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/base.py # zipimport: zlib available <<< 19110 1726882544.01680: stdout chunk (state=3): >>># 
zipimport: zlib available <<< 19110 1726882544.01691: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.01734: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.01774: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.01835: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.01895: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.sysctl # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/sysctl.py <<< 19110 1726882544.01909: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/freebsd.py import ansible.module_utils.facts.virtual.dragonfly # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/dragonfly.py <<< 19110 1726882544.01916: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.01952: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.02002: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/hpux.py <<< 19110 1726882544.02011: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.02172: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.02330: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/linux.py <<< 19110 1726882544.02338: stdout chunk (state=3): >>># zipimport: 
zlib available <<< 19110 1726882544.02375: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.02419: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/netbsd.py # zipimport: zlib available <<< 19110 1726882544.02460: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.02506: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/openbsd.py <<< 19110 1726882544.02512: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.02581: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.02649: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/sunos.py import ansible.module_utils.facts.default_collectors # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/default_collectors.py <<< 19110 1726882544.02659: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.02732: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.02806: stdout chunk (state=3): >>>import ansible.module_utils.facts.ansible_collector # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/ansible_collector.py <<< 19110 1726882544.02813: stdout chunk (state=3): >>>import ansible.module_utils.facts.compat # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/compat.py import ansible.module_utils.facts # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/__init__.py <<< 19110 1726882544.02883: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.03049: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc matches /usr/lib64/python3.9/encodings/idna.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc' <<< 19110 1726882544.03071: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc matches /usr/lib64/python3.9/stringprep.py <<< 19110 1726882544.03079: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc' <<< 19110 1726882544.03110: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' <<< 19110 1726882544.03114: stdout chunk (state=3): >>># extension module 'unicodedata' executed from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdf07d9f0d0> <<< 19110 1726882544.03121: stdout chunk (state=3): >>>import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf07d9fca0> <<< 19110 1726882544.03176: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf07ba8b80> <<< 19110 1726882544.05310: stdout chunk (state=3): >>>import 'gc' # <<< 19110 1726882544.10716: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/queues.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/queues.py <<< 19110 1726882544.10740: stdout chunk (state=3): 
>>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/queues.cpython-39.pyc' <<< 19110 1726882544.10756: stdout chunk (state=3): >>>import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf07d9f670> <<< 19110 1726882544.10767: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/synchronize.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/synchronize.py <<< 19110 1726882544.10790: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/synchronize.cpython-39.pyc' <<< 19110 1726882544.10805: stdout chunk (state=3): >>>import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf07d3e5e0> <<< 19110 1726882544.10873: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/dummy/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.9/multiprocessing/dummy/__pycache__/__init__.cpython-39.pyc' <<< 19110 1726882544.10909: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/dummy/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.9/multiprocessing/dummy/__pycache__/connection.cpython-39.pyc' <<< 19110 1726882544.10912: stdout chunk (state=3): >>>import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf07b9f2e0> <<< 19110 1726882544.10924: stdout chunk (state=3): >>>import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf07d3a8b0> <<< 19110 1726882544.11157: stdout chunk (state=3): >>>PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: 
warning: thread still has a frame <<< 19110 1726882544.31570: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_system": "Linux", "ansible_kernel": "5.14.0-508.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 15:49:37 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "ip-10-31-44-90.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-44-90", "ansible_nodename": "ip-10-31-44-90.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "465293f2bd9b457497a5eaf565f184f8", "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBANd7RrvRqB/kTBmp4g3vOXSd1XQX7zSKmnuTmpsZ60vmB0PwXJIj7HTI9DBqGrzofXOOHlh/Qs4ml+D2H79EO+6EUEZza9meZ+anJVVwXBuxgfn1Hl/EExsmp7gV5o+r0rCFeuFds6ISUZQpal2OlQk3lrit00RA//JoxvQO1YQdAAAAFQDxLJGSeERY5sZYyGr88om1BNq9KQAAAIEA1WO5ElHMof6Upf9GQZn4wlJh6JUOjZfYnL4XATh/W6uye7kuC/rBqGPirkmks1GCUkKhSkzQlRIPyLYENrbPKRMNarYAwwQ8N8BDOUWFDCcrO55SJdlLTyYGWdlVFysYGSMIyZT5ye4oL3Nkff/e1ZGqjvL0sLNJaye4za70Xj4AAACAXRnpJPDKgCzakgHO14HsH3r7qS4cvfQld+ThBJZcbtKtJUyUo1sqg9NbvK+hvA41TYWOVy52DcWn5yzPwfhAn+mQEcAdBXWqSSJdESZ9fPnbc9C1XO02sYpyX5+wsZnau23XhtnlnY8jRTpWgRt4X8AWaSND9mfeL8ZDTgVphRc=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCi3knQjBMOMykE1WC1CWkOsV5q0hmL0iOCSTq050rqbxjUmkXoe4BeOWZh4ID8k5GgD5emRh5PU5ME25rsP6hnelUBzDOCjdgI1rmTkUbw5oFRN+kpX2RbAM/2L5J6RrqqllFBjrHtSG6JINsPIWqLn9rlmE965Kj4AY3QplgiApeN07yDM5YPS2tpRpVHVIxZia7pdzKs+h+TXJoo/Z7SMAw8MTUNeIXd9bSzuxhulOrLNrvrcej4EVg88FYiy96oA+NBNLFb41RHNLumM3xUQvjCeyqP1hcUw9YAY+gwADhRGBKQ+JCAzzLqyM/3RAO8gXXJr1Yjr+H9xi8IwKB71QU/lw7bWc33YuNbe5mDlUHQ/a2qvo4O96wD8m4eZpu81iHiwtIU5cwKm+fk8sz9kxOR77AozaYLtjgW9FYUmxh2ChfTBV2rnFflhC3CjgRMlZv8CLtne5JcsRFSpHeCB2RXzA1JPiF89OxoUFa8NsPqdUyAMUkaR8MmW+fj+t8=", "ansible_ssh_host_key_rsa_public_keytype": 
"ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAqf0PBdBRTqmYqGYFABEC2soCa/Gsi3A2munUEZEo0enWjwRQivB5pKJDjOn6lwgGyr2ebU0/VpRzddk73uEfk=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOdij61fBkKDElS700Z560nYW2c4QCIx/VplUW7jn+UE", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_apparmor": {"status": "disabled"}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.40.7 34614 10.31.44.90 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.40.7 34614 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", 
"ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distri<<< 19110 1726882544.31575: stdout chunk (state=3): >>>bution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_is_chroot": false, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_iscsi_iqn": "", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "02:9e:a1:0b:f9:6d", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.44.90", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::9e:a1ff:fe0b:f96d", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", 
"vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", 
"tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "la<<< 19110 1726882544.31582: stdout chunk (state=3): >>>rge_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": 
"off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.44.90", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:9e:a1:0b:f9:6d", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.44.90"], "ansible_all_ipv6_addresses": ["fe80::9e:a1ff:fe0b:f96d"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.44.90", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::9e:a1ff:fe0b:f96d"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2804, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 728, "free": 2804}, "nocache": {"free": 3266, "used": 266}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2ca037-5a7b-1ebc-28c4-7fb32adce23a", "ansible_product_uuid": "ec2ca037-5a7b-1ebc-28c4-7fb32adce23a", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], 
"uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": <<< 19110 1726882544.31584: stdout chunk (state=3): >>>"512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 702, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264239337472, "block_size": 4096, "block_total": 65519355, "block_available": 64511557, "block_used": 1007798, "inode_total": 131071472, "inode_available": 130998699, "inode_used": 72773, "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013"}], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": 
"/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "35", "second": "44", "epoch": "1726882544", "epoch_int": "1726882544", "date": "2024-09-20", "time": "21:35:44", "iso8601_micro": "2024-09-21T01:35:44.311079Z", "iso8601": "2024-09-21T01:35:44Z", "iso8601_basic": "20240920T213544311079", "iso8601_basic_short": "20240920T213544", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_loadavg": {"1m": 0.46, "5m": 0.39, "15m": 0.21}, "ansible_fips": false, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:ef4e1c39-6f50-438a-87e7-12fb70b80bde", "ansible_lsb": {}, "ansible_local": {}, "ansible_service_mgr": "systemd", "ansible_fibre_channel_wwn": [], "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 19110 1726882544.32124: stdout chunk (state=3): >>># clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache <<< 19110 1726882544.32136: stdout chunk (state=3): >>># clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time <<< 19110 1726882544.32139: stdout chunk (state=3): >>># cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing 
codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale <<< 19110 1726882544.32152: stdout chunk (state=3): >>># cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator <<< 19110 1726882544.32258: stdout chunk (state=3): >>># cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # 
cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors <<< 19110 1726882544.32265: stdout chunk (state=3): >>># cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string <<< 19110 1726882544.32272: 
stdout chunk (state=3): >>># cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text <<< 19110 1726882544.32275: stdout chunk (state=3): >>># cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy 
ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _queue <<< 19110 1726882544.32282: stdout chunk (state=3): >>># cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing 
multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor <<< 19110 1726882544.32284: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime <<< 19110 1726882544.32294: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing 
ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys <<< 19110 1726882544.32296: stdout chunk (state=3): >>># cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd <<< 19110 1726882544.32313: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # 
cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # 
destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # 
destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing gc # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy <<< 19110 1726882544.32591: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 19110 1726882544.32596: stdout chunk (state=3): >>># destroy importlib.util # destroy importlib.abc <<< 19110 1726882544.32599: stdout chunk (state=3): >>># destroy importlib.machinery <<< 19110 1726882544.32627: stdout chunk (state=3): >>># destroy zipimport <<< 19110 1726882544.32633: stdout chunk (state=3): >>># destroy _compression <<< 19110 1726882544.32657: stdout chunk (state=3): >>># destroy binascii # destroy importlib # destroy bz2 # destroy lzma <<< 19110 1726882544.32684: stdout chunk (state=3): >>># destroy __main__ # destroy locale # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json <<< 19110 1726882544.32691: stdout chunk (state=3): >>># destroy encodings <<< 19110 1726882544.32707: stdout chunk (state=3): >>># destroy syslog # destroy uuid <<< 19110 1726882544.32762: stdout chunk (state=3): >>># destroy selinux <<< 19110 1726882544.32769: stdout chunk (state=3): >>># destroy distro # destroy logging # destroy argparse <<< 19110 1726882544.32813: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors # destroy 
ansible.module_utils.facts.ansible_collector<<< 19110 1726882544.32821: stdout chunk (state=3): >>> # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool <<< 19110 1726882544.32824: stdout chunk (state=3): >>># destroy pickle # destroy _compat_pickle <<< 19110 1726882544.32839: stdout chunk (state=3): >>># destroy queue # destroy multiprocessing.reduction <<< 19110 1726882544.32875: stdout chunk (state=3): >>># destroy shlex <<< 19110 1726882544.32881: stdout chunk (state=3): >>># destroy datetime <<< 19110 1726882544.32909: stdout chunk (state=3): >>># destroy base64 # destroy ansible.module_utils.compat.selinux <<< 19110 1726882544.32917: stdout chunk (state=3): >>># destroy getpass <<< 19110 1726882544.32920: stdout chunk (state=3): >>># destroy json <<< 19110 1726882544.32979: stdout chunk (state=3): >>># destroy socket # destroy struct # destroy glob <<< 19110 1726882544.32982: stdout chunk (state=3): >>># destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector <<< 19110 1726882544.32985: stdout chunk (state=3): >>># destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy array <<< 19110 1726882544.32987: stdout chunk (state=3): >>># destroy multiprocessing.dummy.connection <<< 19110 1726882544.33078: stdout chunk (state=3): >>># cleanup[3] wiping gc # cleanup[3] wiping encodings.idna # destroy stringprep <<< 19110 1726882544.33086: stdout chunk (state=3): >>># cleanup[3] wiping unicodedata # cleanup[3] wiping termios # cleanup[3] wiping _ssl # cleanup[3] wiping configparser # cleanup[3] wiping _multiprocessing # cleanup[3] wiping _queue # cleanup[3] wiping _pickle # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes <<< 19110 
1726882544.33090: stdout chunk (state=3): >>># cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket <<< 19110 1726882544.33093: stdout chunk (state=3): >>># cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal <<< 19110 1726882544.33099: stdout chunk (state=3): >>># cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize <<< 19110 1726882544.33105: stdout chunk (state=3): >>># cleanup[3] wiping platform # destroy subprocess <<< 19110 1726882544.33142: stdout chunk (state=3): >>># cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading <<< 19110 1726882544.33148: stdout chunk (state=3): >>># cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external <<< 19110 1726882544.33167: stdout chunk (state=3): >>># cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools <<< 19110 1726882544.33225: stdout chunk (state=3): >>># cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy 
collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os <<< 19110 1726882544.33268: stdout chunk (state=3): >>># cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix <<< 19110 1726882544.33272: stdout chunk (state=3): >>># cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib <<< 19110 1726882544.33274: stdout chunk (state=3): >>># cleanup[3] wiping sys # cleanup[3] wiping builtins <<< 19110 1726882544.33304: stdout chunk (state=3): >>># destroy gc # destroy unicodedata # destroy termios # destroy _ssl # destroy _multiprocessing # destroy _queue # destroy _pickle <<< 19110 1726882544.33320: stdout chunk (state=3): >>># destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal <<< 19110 1726882544.33476: stdout chunk (state=3): >>># destroy platform # destroy _uuid # destroy _sre # destroy sre_parse <<< 19110 1726882544.33497: stdout chunk (state=3): >>># destroy tokenize <<< 19110 1726882544.33517: stdout chunk (state=3): >>># destroy _heapq # 
destroy posixpath # destroy stat <<< 19110 1726882544.33536: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors <<< 19110 1726882544.33561: stdout chunk (state=3): >>># destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator <<< 19110 1726882544.33575: stdout chunk (state=3): >>># destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal <<< 19110 1726882544.33614: stdout chunk (state=3): >>># destroy _frozen_importlib # clear sys.audit hooks <<< 19110 1726882544.33940: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
<<< 19110 1726882544.34028: stderr chunk (state=3): >>><<< 19110 1726882544.34031: stdout chunk (state=3): >>><<< 19110 1726882544.34296: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py # code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' import '_codecs' # import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf091f3dc0> # /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf091983a0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf091f3b20> # /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf091f3ac0> import '_signal' # # /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf09198490> # 
/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py # code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py # code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' import '_abc' # import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf09198940> import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf09198670> # /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py # code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py # code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py # code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' import '_stat' # import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf0914f190> # /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py # code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf0914f220> # /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py # code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py # code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf09172850> import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf0914f940> import 'os' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fdf091b0880> # /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py # code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf09148d90> # /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py # code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' import '_locale' # import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf09172d90> import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf09198970> Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py # code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py # code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py # code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py # code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf090eeeb0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf090f1f40> # /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py # code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' import '_sre' # # 
/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py # code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py # code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf090e7610> import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf090ed640> import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf090ee370> # /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py # code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/collections/__init__.py # code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py # code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdf08d91dc0> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf08d918b0> import 'itertools' # # /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py # code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf08d91eb0> # /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches 
/usr/lib64/python3.9/operator.py # code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf08d91f70> # /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py # code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf08d91e80> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf090c9d30> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf090c2610> # /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py # code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf090d5670> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf090f5e20> # /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdf08da3c70> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf090c9250> # extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdf090d5280> import 
'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf090fb9d0> # /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py # code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py # code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf08da3fa0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf08da3d90> # /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches /usr/lib64/python3.9/importlib/machinery.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf08da3d00> # /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py # code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py # code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf08d76370> # 
/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches /usr/lib64/python3.9/contextlib.py # code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf08d76460> import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf08daafa0> import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf08da5a30> import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf08da5490> # /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py # code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py # code object from '/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py # code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf08cc41c0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf08d61c70> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf08da5eb0> import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf090fb040> # /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py # code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py # code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf08cd6af0> import 'errno' # # extension module 'zlib' loaded from 
'/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdf08cd6e20> # /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py # code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py # code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf08ce8730> # /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py # code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf08ce8c70> # extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdf08c803a0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf08cd6f10> # /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py # code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdf08c91280> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf08ce85b0> import 'pwd' # # extension module 'grp' 
loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdf08c91340> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf08da39d0> # /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py # code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py # code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' # extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdf08cac6a0> # /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdf08cac970> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf08cac760> # extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdf08cac850> # /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc 
matches /usr/lib64/python3.9/hashlib.py # code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdf08cacca0> # extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdf08cb91f0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf08cac8e0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf08ca0a30> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf08da35b0> # /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py # code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf08caca90> # code object from '/usr/lib64/python3.9/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fdf08bdb670> # zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available # zipimport: zlib available import ansible # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/__init__.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py # code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf085447f0> # /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py # code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py # code object from '/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' # extension module '_json' loaded from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdf085d4760> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf085d4640> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf085d4370> # /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf085d4490> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf085d4190> import 'atexit' # # extension module 'fcntl' loaded from 
'/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdf085d4400> # /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py # code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf085d47c0> # /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py # code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py # code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py # code object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf085b07c0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdf085b0b50> # extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdf085b09a0> # /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py # code object from 
'/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf084c74f0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf085ced30> import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf085d4520> # /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py # code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf085ce190> # /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py # code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py # code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches /usr/lib64/python3.9/tokenize.py # code object from '/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py # code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf085ffa90> import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf085a2190> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf085a2790> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf084cdd00> # extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdf085a26a0> # 
/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf08b4ed30> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py # code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdf085259a0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf08b58e50> # /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py # code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdf085350d0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf08b58e20> # /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py # code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' # 
/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py # code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf08b60220> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf08535100> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdf085f9b80> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdf08b58ac0> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdf08b58d00> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf08bdb820> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py # code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' # extension module 
'_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdf085310d0> # extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdf08527370> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf08531d00> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdf085316a0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf08532130> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/__init__.py # zipimport: zlib available # 
zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.six # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/six/__init__.py import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import ansible.module_utils.common.text.converters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/converters.py # /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdf0856d8b0> # /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf08572910> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf080c56a0> import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/selinux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils._text # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/_text.py # zipimport: zlib available # zipimport: zlib available # 
/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py # code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf085ac7f0> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.collections # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/collections.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/warnings.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.errors # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/errors.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/parsing/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing.convert_bool # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/parsing/convert_bool.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py # code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf080c9d90> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text.formatters # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/formatters.py import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/validation.py import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/parameters.py import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/arg_spec.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.locale # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/locale.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdf085600a0> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf08090070> import ansible.module_utils.common.file # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/file.py import 
ansible.module_utils.common.process # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/process.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py # code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py # code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py # code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf08569160> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf08565cd0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf080c9bb0> # destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/distro/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/_utils.py import ansible.module_utils.common.sys_info # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/sys_info.py import ansible.module_utils.basic # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/basic.py # zipimport: zlib available # 
zipimport: zlib available import ansible.modules # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/modules/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.namespace # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/namespace.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.typing # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/typing.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/__init__.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/context.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc' # /usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/process.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf07e44a60> # /usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/reduction.py # code object from 
'/usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc matches /usr/lib64/python3.9/pickle.py # code object from '/usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc matches /usr/lib64/python3.9/_compat_pickle.py # code object from '/usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf080a46d0> # extension module '_pickle' loaded from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdf080a4af0> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf08089250> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf08089a30> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf080d8460> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf080d8910> # /usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/pool.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc matches /usr/lib64/python3.9/queue.py # code object from '/usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc' # extension module '_queue' loaded from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdf080d6d00> import 'queue' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fdf080d6d60> # /usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/util.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf080d6250> # /usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/connection.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdf07eacf70> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf080e6b50> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf080d8310> import ansible.module_utils.facts.timeout # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/timeout.py import ansible.module_utils.facts.collector # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/collector.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other.facter # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/facter.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other.ohai # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/ohai.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.apparmor # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/apparmor.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.caps # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/caps.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.chroot # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/chroot.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/utils.py import ansible.module_utils.facts.system.cmdline # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/cmdline.py # zipimport: zlib available # zipimport: zlib available import 
ansible.module_utils.facts.system.distribution # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/distribution.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.datetime # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/datetime.py import ansible.module_utils.facts.system.date_time # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/date_time.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.env # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/env.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.dns # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/dns.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.fips # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/fips.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.loadavg # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/loadavg.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc matches /usr/lib64/python3.9/glob.py # code object from '/usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc' import 'glob' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fdf07dc4ca0> # /usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc matches /usr/lib64/python3.9/configparser.py # code object from '/usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf07dc4fd0> import ansible.module_utils.facts.system.local # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/local.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.lsb # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/lsb.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.pkg_mgr # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/pkg_mgr.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.platform # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/platform.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc matches /usr/lib64/python3.9/ssl.py # code object from '/usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdf07daa370> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf07e07bb0> import ansible.module_utils.facts.system.python # 
loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/python.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.selinux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/selinux.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.version # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/version.py import ansible.module_utils.facts.system.service_mgr # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/service_mgr.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.ssh_pub_keys # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/ssh_pub_keys.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc matches /usr/lib64/python3.9/getpass.py # code object from '/usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdf07d3e160> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf07d3e2b0> import ansible.module_utils.facts.system.user # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/user.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/base.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.aix # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/aix.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.sysctl # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/sysctl.py import ansible.module_utils.facts.hardware.darwin # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/darwin.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/freebsd.py import ansible.module_utils.facts.hardware.dragonfly # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/dragonfly.py # zipimport: zlib available # zipimport: zlib available 
import ansible.module_utils.facts.hardware.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/hpux.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/linux.py import ansible.module_utils.facts.hardware.hurd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/hurd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/netbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/sunos.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.base # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/base.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.generic_bsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/generic_bsd.py import ansible.module_utils.facts.network.aix # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/aix.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.darwin # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/darwin.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.dragonfly # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.fc_wwn # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/fc_wwn.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/freebsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/hpux.py # zipimport: zlib available # 
zipimport: zlib available import ansible.module_utils.facts.network.hurd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/hurd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/linux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.iscsi # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/iscsi.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.nvme # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/nvme.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/netbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/sunos.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/base.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.sysctl # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/sysctl.py import ansible.module_utils.facts.virtual.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/freebsd.py import ansible.module_utils.facts.virtual.dragonfly # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/hpux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/linux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/netbsd.py # zipimport: zlib available # zipimport: 
zlib available import ansible.module_utils.facts.virtual.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/sunos.py import ansible.module_utils.facts.default_collectors # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/default_collectors.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.ansible_collector # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/ansible_collector.py import ansible.module_utils.facts.compat # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/compat.py import ansible.module_utils.facts # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_mp007qwc/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/__init__.py # zipimport: zlib available # /usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc matches /usr/lib64/python3.9/encodings/idna.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc matches /usr/lib64/python3.9/stringprep.py # code object from '/usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' import 
'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdf07d9f0d0> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf07d9fca0> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf07ba8b80> import 'gc' # # /usr/lib64/python3.9/multiprocessing/__pycache__/queues.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/queues.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/queues.cpython-39.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf07d9f670> # /usr/lib64/python3.9/multiprocessing/__pycache__/synchronize.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/synchronize.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/synchronize.cpython-39.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf07d3e5e0> # /usr/lib64/python3.9/multiprocessing/dummy/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.9/multiprocessing/dummy/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/multiprocessing/dummy/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.9/multiprocessing/dummy/__pycache__/connection.cpython-39.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf07b9f2e0> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdf07d3a8b0> PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame {"ansible_facts": {"ansible_system": "Linux", 
"ansible_kernel": "5.14.0-508.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 15:49:37 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "ip-10-31-44-90.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-44-90", "ansible_nodename": "ip-10-31-44-90.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "465293f2bd9b457497a5eaf565f184f8", "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBANd7RrvRqB/kTBmp4g3vOXSd1XQX7zSKmnuTmpsZ60vmB0PwXJIj7HTI9DBqGrzofXOOHlh/Qs4ml+D2H79EO+6EUEZza9meZ+anJVVwXBuxgfn1Hl/EExsmp7gV5o+r0rCFeuFds6ISUZQpal2OlQk3lrit00RA//JoxvQO1YQdAAAAFQDxLJGSeERY5sZYyGr88om1BNq9KQAAAIEA1WO5ElHMof6Upf9GQZn4wlJh6JUOjZfYnL4XATh/W6uye7kuC/rBqGPirkmks1GCUkKhSkzQlRIPyLYENrbPKRMNarYAwwQ8N8BDOUWFDCcrO55SJdlLTyYGWdlVFysYGSMIyZT5ye4oL3Nkff/e1ZGqjvL0sLNJaye4za70Xj4AAACAXRnpJPDKgCzakgHO14HsH3r7qS4cvfQld+ThBJZcbtKtJUyUo1sqg9NbvK+hvA41TYWOVy52DcWn5yzPwfhAn+mQEcAdBXWqSSJdESZ9fPnbc9C1XO02sYpyX5+wsZnau23XhtnlnY8jRTpWgRt4X8AWaSND9mfeL8ZDTgVphRc=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCi3knQjBMOMykE1WC1CWkOsV5q0hmL0iOCSTq050rqbxjUmkXoe4BeOWZh4ID8k5GgD5emRh5PU5ME25rsP6hnelUBzDOCjdgI1rmTkUbw5oFRN+kpX2RbAM/2L5J6RrqqllFBjrHtSG6JINsPIWqLn9rlmE965Kj4AY3QplgiApeN07yDM5YPS2tpRpVHVIxZia7pdzKs+h+TXJoo/Z7SMAw8MTUNeIXd9bSzuxhulOrLNrvrcej4EVg88FYiy96oA+NBNLFb41RHNLumM3xUQvjCeyqP1hcUw9YAY+gwADhRGBKQ+JCAzzLqyM/3RAO8gXXJr1Yjr+H9xi8IwKB71QU/lw7bWc33YuNbe5mDlUHQ/a2qvo4O96wD8m4eZpu81iHiwtIU5cwKm+fk8sz9kxOR77AozaYLtjgW9FYUmxh2ChfTBV2rnFflhC3CjgRMlZv8CLtne5JcsRFSpHeCB2RXzA1JPiF89OxoUFa8NsPqdUyAMUkaR8MmW+fj+t8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": 
"AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAqf0PBdBRTqmYqGYFABEC2soCa/Gsi3A2munUEZEo0enWjwRQivB5pKJDjOn6lwgGyr2ebU0/VpRzddk73uEfk=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOdij61fBkKDElS700Z560nYW2c4QCIx/VplUW7jn+UE", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_apparmor": {"status": "disabled"}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.40.7 34614 10.31.44.90 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.40.7 34614 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", 
"ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_is_chroot": false, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_iscsi_iqn": "", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "02:9e:a1:0b:f9:6d", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.44.90", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::9e:a1ff:fe0b:f96d", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", 
"tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", 
"tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, 
"ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.44.90", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:9e:a1:0b:f9:6d", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.44.90"], "ansible_all_ipv6_addresses": ["fe80::9e:a1ff:fe0b:f96d"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.44.90", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::9e:a1ff:fe0b:f96d"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2804, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 728, "free": 2804}, "nocache": {"free": 3266, "used": 266}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2ca037-5a7b-1ebc-28c4-7fb32adce23a", "ansible_product_uuid": "ec2ca037-5a7b-1ebc-28c4-7fb32adce23a", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, 
"removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 702, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264239337472, "block_size": 4096, "block_total": 65519355, "block_available": 64511557, "block_used": 1007798, "inode_total": 131071472, "inode_available": 130998699, "inode_used": 72773, "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013"}], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", 
"day": "20", "hour": "21", "minute": "35", "second": "44", "epoch": "1726882544", "epoch_int": "1726882544", "date": "2024-09-20", "time": "21:35:44", "iso8601_micro": "2024-09-21T01:35:44.311079Z", "iso8601": "2024-09-21T01:35:44Z", "iso8601_basic": "20240920T213544311079", "iso8601_basic_short": "20240920T213544", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_loadavg": {"1m": 0.46, "5m": 0.39, "15m": 0.21}, "ansible_fips": false, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:ef4e1c39-6f50-438a-87e7-12fb70b80bde", "ansible_lsb": {}, "ansible_local": {}, "ansible_service_mgr": "systemd", "ansible_fibre_channel_wwn": [], "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # 
cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] 
removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing 
ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] 
removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing 
ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing 
ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing 
ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy 
ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing gc # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.util # destroy importlib.abc # destroy importlib.machinery # destroy zipimport # destroy _compression # destroy 
binascii # destroy importlib # destroy bz2 # destroy lzma # destroy __main__ # destroy locale # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog # destroy uuid # destroy selinux # destroy distro # destroy logging # destroy argparse # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy pickle # destroy _compat_pickle # destroy queue # destroy multiprocessing.reduction # destroy shlex # destroy datetime # destroy base64 # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy json # destroy socket # destroy struct # destroy glob # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping gc # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping unicodedata # cleanup[3] wiping termios # cleanup[3] wiping _ssl # cleanup[3] wiping configparser # cleanup[3] wiping _multiprocessing # cleanup[3] wiping _queue # cleanup[3] wiping _pickle # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # 
cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] 
wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy gc # destroy unicodedata # destroy termios # destroy _ssl # destroy _multiprocessing # destroy _queue # destroy _pickle # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal # destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize # destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal # destroy _frozen_importlib # clear sys.audit hooks , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed.
[WARNING]: Module invocation had junk after the JSON data: [identical Python interpreter shutdown trace elided; see module stdout above]
[WARNING]: Platform linux on host managed_node1 is using the discovered Python interpreter at /usr/bin/python3.9, but future installation of another Python interpreter could change the meaning of that path. See https://docs.ansible.com/ansible-core/2.17/reference_appendices/interpreter_discovery.html for more information.
19110 1726882544.36007: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882543.2019453-19118-85818211078291/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None})
19110 1726882544.36010: _low_level_execute_command(): starting
19110 1726882544.36012: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882543.2019453-19118-85818211078291/ > /dev/null 2>&1 && sleep 0'
19110 1726882544.37246: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<<
19110 1726882544.37257: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
19110 1726882544.37270: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
19110 1726882544.37282: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
19110 1726882544.37326: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<<
19110 1726882544.37408: stderr chunk (state=3): >>>debug2: match not found <<<
19110 1726882544.37418: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
19110 1726882544.37426: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<<
19110 1726882544.37434: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<<
19110 1726882544.37442: stderr chunk (state=3): >>>debug1: re-parsing configuration <<<
19110 1726882544.37449: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
19110 1726882544.37459: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
19110 1726882544.37472: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
19110 1726882544.37479: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<<
19110 1726882544.37486: stderr chunk (state=3): >>>debug2: match found <<<
19110 1726882544.37497: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
19110 1726882544.37573: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
19110 1726882544.37628: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
19110 1726882544.37643: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
19110 1726882544.37761: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
19110 1726882544.39687: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
19110 1726882544.39691: stdout chunk (state=3): >>><<<
19110 1726882544.39694: stderr chunk (state=3): >>><<<
19110 1726882544.39709: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
19110 1726882544.39717: handler run complete
19110 1726882544.39856: variable 'ansible_facts' from source: unknown
19110 1726882544.39959: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
19110 1726882544.41447: variable 'ansible_facts' from source: unknown
19110 1726882544.41522: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
19110 1726882544.41631: attempt loop complete, returning result
19110 1726882544.41634: _execute() done
19110 1726882544.41637: dumping result to json
19110 1726882544.41667: done dumping result, returning
19110 1726882544.41676: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [0e448fcc-3ce9-5372-c19a-00000000007c]
19110 1726882544.41682: sending task result for task 0e448fcc-3ce9-5372-c19a-00000000007c
19110 1726882544.42417: done sending task result for task 0e448fcc-3ce9-5372-c19a-00000000007c
19110 1726882544.42420: WORKER PROCESS EXITING
ok: [managed_node1]
19110 1726882544.42736: no more pending results, returning what we have
19110 1726882544.42739: results queue empty
19110 1726882544.42740: checking for any_errors_fatal
19110 1726882544.42744: done checking for any_errors_fatal
19110 1726882544.42745: checking for max_fail_percentage
19110 1726882544.42746: done checking for max_fail_percentage
19110 1726882544.42747: checking to see if all hosts have failed and the running result is not ok
19110 1726882544.42748: done checking to see if all hosts have failed
19110 1726882544.42749: getting the remaining hosts for this loop
19110 1726882544.42750: done getting the remaining hosts for this loop
19110 1726882544.42756: getting the next task for host managed_node1
19110 1726882544.42762: done getting next task for host managed_node1
19110 1726882544.42765: ^ task is: TASK: meta (flush_handlers)
19110 1726882544.42767: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
19110 1726882544.42771: getting variables
19110 1726882544.42772: in VariableManager get_vars()
19110 1726882544.42794: Calling all_inventory to load vars for managed_node1
19110 1726882544.42797: Calling groups_inventory to load vars for managed_node1
19110 1726882544.42800: Calling all_plugins_inventory to load vars for managed_node1
19110 1726882544.42809: Calling all_plugins_play to load vars for managed_node1
19110 1726882544.42812: Calling groups_plugins_inventory to load vars for managed_node1
19110 1726882544.42820: Calling groups_plugins_play to load vars for managed_node1
19110 1726882544.43013: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
19110 1726882544.43285: done with get_vars()
19110 1726882544.43296: done getting variables
19110 1726882544.43374: in VariableManager get_vars()
19110 1726882544.43383: Calling all_inventory to load vars for managed_node1
19110 1726882544.43386: Calling groups_inventory to load vars for managed_node1
19110 1726882544.43388: Calling all_plugins_inventory to load vars for managed_node1
19110 1726882544.43392: Calling all_plugins_play to load vars for managed_node1
19110 1726882544.43394: Calling groups_plugins_inventory to load vars for managed_node1
19110 1726882544.43401: Calling groups_plugins_play to load vars for managed_node1
19110 1726882544.43706: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
19110 1726882544.43920: done with get_vars()
19110 1726882544.43933: done queuing things up, now waiting for results queue to drain
19110 1726882544.43936: results queue empty
19110 1726882544.43937: checking for any_errors_fatal
19110 1726882544.43939: done checking for any_errors_fatal
19110 1726882544.43939: checking for max_fail_percentage
19110 1726882544.43940: done checking for max_fail_percentage
19110 1726882544.43941: checking to see if all hosts have failed and the running result is not ok
19110 1726882544.43942: done checking to see if all hosts have failed
19110 1726882544.43943: getting the remaining hosts for this loop
19110 1726882544.43944: done getting the remaining hosts for this loop
19110 1726882544.43946: getting the next task for host managed_node1
19110 1726882544.43951: done getting next task for host managed_node1
19110 1726882544.43953: ^ task is: TASK: Include the task 'el_repo_setup.yml'
19110 1726882544.43957: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task?
False 19110 1726882544.43960: getting variables 19110 1726882544.43960: in VariableManager get_vars() 19110 1726882544.43970: Calling all_inventory to load vars for managed_node1 19110 1726882544.43972: Calling groups_inventory to load vars for managed_node1 19110 1726882544.43975: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882544.43979: Calling all_plugins_play to load vars for managed_node1 19110 1726882544.43982: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882544.43984: Calling groups_plugins_play to load vars for managed_node1 19110 1726882544.44134: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882544.44334: done with get_vars() 19110 1726882544.44346: done getting variables TASK [Include the task 'el_repo_setup.yml'] ************************************ task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tests_ethernet_nm.yml:11 Friday 20 September 2024 21:35:44 -0400 (0:00:01.291) 0:00:01.301 ****** 19110 1726882544.44428: entering _queue_task() for managed_node1/include_tasks 19110 1726882544.44430: Creating lock for include_tasks 19110 1726882544.44711: worker is 1 (out of 1 available) 19110 1726882544.44724: exiting _queue_task() for managed_node1/include_tasks 19110 1726882544.44734: done queuing things up, now waiting for results queue to drain 19110 1726882544.44736: waiting for pending results... 
19110 1726882544.44974: running TaskExecutor() for managed_node1/TASK: Include the task 'el_repo_setup.yml' 19110 1726882544.45046: in run() - task 0e448fcc-3ce9-5372-c19a-000000000006 19110 1726882544.45060: variable 'ansible_search_path' from source: unknown 19110 1726882544.45110: calling self._execute() 19110 1726882544.45483: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882544.45486: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882544.45488: variable 'omit' from source: magic vars 19110 1726882544.45490: _execute() done 19110 1726882544.45493: dumping result to json 19110 1726882544.45494: done dumping result, returning 19110 1726882544.45497: done running TaskExecutor() for managed_node1/TASK: Include the task 'el_repo_setup.yml' [0e448fcc-3ce9-5372-c19a-000000000006] 19110 1726882544.45499: sending task result for task 0e448fcc-3ce9-5372-c19a-000000000006 19110 1726882544.45608: no more pending results, returning what we have 19110 1726882544.45614: in VariableManager get_vars() 19110 1726882544.45647: Calling all_inventory to load vars for managed_node1 19110 1726882544.45650: Calling groups_inventory to load vars for managed_node1 19110 1726882544.45654: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882544.45671: Calling all_plugins_play to load vars for managed_node1 19110 1726882544.45675: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882544.45679: Calling groups_plugins_play to load vars for managed_node1 19110 1726882544.45883: done sending task result for task 0e448fcc-3ce9-5372-c19a-000000000006 19110 1726882544.45889: WORKER PROCESS EXITING 19110 1726882544.45904: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882544.46093: done with get_vars() 19110 1726882544.46100: variable 'ansible_search_path' from source: unknown 19110 1726882544.46115: we have 
included files to process 19110 1726882544.46116: generating all_blocks data 19110 1726882544.46118: done generating all_blocks data 19110 1726882544.46118: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 19110 1726882544.46120: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 19110 1726882544.46122: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 19110 1726882544.46919: in VariableManager get_vars() 19110 1726882544.46936: done with get_vars() 19110 1726882544.46948: done processing included file 19110 1726882544.46950: iterating over new_blocks loaded from include file 19110 1726882544.46952: in VariableManager get_vars() 19110 1726882544.46962: done with get_vars() 19110 1726882544.46965: filtering new block on tags 19110 1726882544.46979: done filtering new block on tags 19110 1726882544.46982: in VariableManager get_vars() 19110 1726882544.46991: done with get_vars() 19110 1726882544.46992: filtering new block on tags 19110 1726882544.47007: done filtering new block on tags 19110 1726882544.47010: in VariableManager get_vars() 19110 1726882544.47036: done with get_vars() 19110 1726882544.47038: filtering new block on tags 19110 1726882544.47051: done filtering new block on tags 19110 1726882544.47053: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml for managed_node1 19110 1726882544.47058: extending task lists for all hosts with included blocks 19110 1726882544.47106: done extending task lists 19110 1726882544.47107: done processing included files 19110 1726882544.47108: results queue empty 19110 1726882544.47109: checking for any_errors_fatal 19110 1726882544.47110: done checking for any_errors_fatal 19110 
1726882544.47111: checking for max_fail_percentage 19110 1726882544.47112: done checking for max_fail_percentage 19110 1726882544.47112: checking to see if all hosts have failed and the running result is not ok 19110 1726882544.47113: done checking to see if all hosts have failed 19110 1726882544.47114: getting the remaining hosts for this loop 19110 1726882544.47115: done getting the remaining hosts for this loop 19110 1726882544.47117: getting the next task for host managed_node1 19110 1726882544.47121: done getting next task for host managed_node1 19110 1726882544.47123: ^ task is: TASK: Gather the minimum subset of ansible_facts required by the network role test 19110 1726882544.47125: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19110 1726882544.47127: getting variables 19110 1726882544.47128: in VariableManager get_vars() 19110 1726882544.47135: Calling all_inventory to load vars for managed_node1 19110 1726882544.47137: Calling groups_inventory to load vars for managed_node1 19110 1726882544.47139: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882544.47144: Calling all_plugins_play to load vars for managed_node1 19110 1726882544.47146: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882544.47149: Calling groups_plugins_play to load vars for managed_node1 19110 1726882544.47744: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882544.47944: done with get_vars() 19110 1726882544.47952: done getting variables TASK [Gather the minimum subset of ansible_facts required by the network role test] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3 Friday 20 September 2024 21:35:44 -0400 (0:00:00.035) 0:00:01.337 ****** 19110 1726882544.48016: entering _queue_task() for managed_node1/setup 19110 1726882544.48263: worker is 1 (out of 1 available) 19110 1726882544.48276: exiting _queue_task() for managed_node1/setup 19110 1726882544.48286: done queuing things up, now waiting for results queue to drain 19110 1726882544.48288: waiting for pending results... 
19110 1726882544.49101: running TaskExecutor() for managed_node1/TASK: Gather the minimum subset of ansible_facts required by the network role test 19110 1726882544.49174: in run() - task 0e448fcc-3ce9-5372-c19a-00000000008d 19110 1726882544.49184: variable 'ansible_search_path' from source: unknown 19110 1726882544.49187: variable 'ansible_search_path' from source: unknown 19110 1726882544.49232: calling self._execute() 19110 1726882544.49305: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882544.49314: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882544.49326: variable 'omit' from source: magic vars 19110 1726882544.49868: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 19110 1726882544.53256: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 19110 1726882544.53344: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 19110 1726882544.53391: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 19110 1726882544.53433: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 19110 1726882544.53472: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 19110 1726882544.53550: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19110 1726882544.53590: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19110 1726882544.53620: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19110 1726882544.53667: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19110 1726882544.53689: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19110 1726882544.53862: variable 'ansible_facts' from source: unknown 19110 1726882544.53942: variable 'network_test_required_facts' from source: task vars 19110 1726882544.53986: Evaluated conditional (not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts): True 19110 1726882544.53999: variable 'omit' from source: magic vars 19110 1726882544.54039: variable 'omit' from source: magic vars 19110 1726882544.54077: variable 'omit' from source: magic vars 19110 1726882544.54110: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19110 1726882544.54141: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19110 1726882544.54165: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19110 1726882544.54187: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19110 1726882544.54204: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19110 1726882544.54238: variable 'inventory_hostname' from source: host vars for 'managed_node1' 19110 1726882544.54246: variable 'ansible_host' from source: host vars for 
'managed_node1' 19110 1726882544.54252: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882544.54352: Set connection var ansible_timeout to 10 19110 1726882544.54372: Set connection var ansible_module_compression to ZIP_DEFLATED 19110 1726882544.54382: Set connection var ansible_shell_executable to /bin/sh 19110 1726882544.54390: Set connection var ansible_shell_type to sh 19110 1726882544.54398: Set connection var ansible_connection to ssh 19110 1726882544.54407: Set connection var ansible_pipelining to False 19110 1726882544.54438: variable 'ansible_shell_executable' from source: unknown 19110 1726882544.54447: variable 'ansible_connection' from source: unknown 19110 1726882544.54455: variable 'ansible_module_compression' from source: unknown 19110 1726882544.54461: variable 'ansible_shell_type' from source: unknown 19110 1726882544.54471: variable 'ansible_shell_executable' from source: unknown 19110 1726882544.54478: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882544.54486: variable 'ansible_pipelining' from source: unknown 19110 1726882544.54493: variable 'ansible_timeout' from source: unknown 19110 1726882544.54500: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882544.54644: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 19110 1726882544.54659: variable 'omit' from source: magic vars 19110 1726882544.54674: starting attempt loop 19110 1726882544.54681: running the handler 19110 1726882544.54696: _low_level_execute_command(): starting 19110 1726882544.54706: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 19110 1726882544.55480: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 
1726882544.55494: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882544.55507: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882544.55529: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882544.55571: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882544.55583: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882544.55596: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882544.55612: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882544.55626: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882544.55636: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882544.55647: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882544.55660: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882544.55677: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882544.55688: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882544.55698: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882544.55710: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882544.55790: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882544.55811: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882544.55825: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882544.55968: 
stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882544.57640: stdout chunk (state=3): >>>/root <<< 19110 1726882544.57827: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882544.57830: stdout chunk (state=3): >>><<< 19110 1726882544.57833: stderr chunk (state=3): >>><<< 19110 1726882544.57942: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 1726882544.57946: _low_level_execute_command(): starting 19110 1726882544.57948: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882544.5785136-19158-199492729758849 `" && echo ansible-tmp-1726882544.5785136-19158-199492729758849="` echo 
/root/.ansible/tmp/ansible-tmp-1726882544.5785136-19158-199492729758849 `" ) && sleep 0' 19110 1726882544.58981: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882544.58997: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882544.59013: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882544.59030: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882544.59075: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882544.59088: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882544.59101: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882544.59117: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882544.59127: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882544.59137: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882544.59151: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882544.59166: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882544.59182: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882544.59193: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882544.59203: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882544.59215: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882544.59295: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882544.59316: 
stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882544.59331: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882544.59452: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882544.61325: stdout chunk (state=3): >>>ansible-tmp-1726882544.5785136-19158-199492729758849=/root/.ansible/tmp/ansible-tmp-1726882544.5785136-19158-199492729758849 <<< 19110 1726882544.61485: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882544.61516: stderr chunk (state=3): >>><<< 19110 1726882544.61519: stdout chunk (state=3): >>><<< 19110 1726882544.61772: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882544.5785136-19158-199492729758849=/root/.ansible/tmp/ansible-tmp-1726882544.5785136-19158-199492729758849 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from 
master 0 19110 1726882544.61776: variable 'ansible_module_compression' from source: unknown 19110 1726882544.61778: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-191108pnkimox/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 19110 1726882544.61780: variable 'ansible_facts' from source: unknown 19110 1726882544.61931: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882544.5785136-19158-199492729758849/AnsiballZ_setup.py 19110 1726882544.62154: Sending initial data 19110 1726882544.62158: Sent initial data (154 bytes) 19110 1726882544.64015: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882544.64029: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882544.64043: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882544.64060: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882544.64107: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882544.64118: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882544.64129: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882544.64144: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882544.64153: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882544.64162: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882544.64177: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882544.64193: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882544.64288: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
<<< 19110 1726882544.64305: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882544.64320: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882544.64333: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882544.64526: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882544.64547: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882544.64565: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882544.64692: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882544.66492: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 19110 1726882544.66584: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 19110 1726882544.66679: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-191108pnkimox/tmpsrtjheas /root/.ansible/tmp/ansible-tmp-1726882544.5785136-19158-199492729758849/AnsiballZ_setup.py <<< 19110 1726882544.66797: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 19110 1726882544.69575: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 
1726882544.69837: stderr chunk (state=3): >>><<< 19110 1726882544.69840: stdout chunk (state=3): >>><<< 19110 1726882544.69843: done transferring module to remote 19110 1726882544.69845: _low_level_execute_command(): starting 19110 1726882544.69847: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882544.5785136-19158-199492729758849/ /root/.ansible/tmp/ansible-tmp-1726882544.5785136-19158-199492729758849/AnsiballZ_setup.py && sleep 0' 19110 1726882544.70508: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882544.70524: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882544.70541: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882544.70566: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882544.70607: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882544.70620: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882544.70635: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882544.70672: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882544.70686: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882544.70698: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882544.70711: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882544.70725: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882544.70742: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882544.70762: stderr chunk (state=3): >>>debug2: checking 
match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882544.70783: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882544.70799: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882544.70922: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882544.70945: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882544.70972: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882544.71334: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882544.72914: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882544.72969: stderr chunk (state=3): >>><<< 19110 1726882544.72980: stdout chunk (state=3): >>><<< 19110 1726882544.73070: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 
debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 1726882544.73073: _low_level_execute_command(): starting 19110 1726882544.73076: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882544.5785136-19158-199492729758849/AnsiballZ_setup.py && sleep 0' 19110 1726882544.73799: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882544.73803: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882544.73842: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found <<< 19110 1726882544.73846: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882544.73848: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882544.73906: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882544.73909: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882544.74021: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882544.76388: stdout chunk (state=3): >>>import _frozen_importlib # frozen import 
_imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py # code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' import '_codecs' # import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681cbb3dc0> # /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681cb583a0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681cbb3b20> <<< 19110 1726882544.76418: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' <<< 19110 1726882544.76425: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681cbb3ac0> <<< 19110 1726882544.76444: stdout chunk (state=3): >>>import '_signal' # <<< 19110 1726882544.76482: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py <<< 19110 1726882544.76490: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' <<< 19110 1726882544.76492: stdout chunk (state=3): >>>import 
'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681cb58490> <<< 19110 1726882544.76522: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py # code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' <<< 19110 1726882544.76530: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py # code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' <<< 19110 1726882544.76542: stdout chunk (state=3): >>>import '_abc' # import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681cb58940> <<< 19110 1726882544.76573: stdout chunk (state=3): >>>import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681cb58670> <<< 19110 1726882544.76609: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py <<< 19110 1726882544.76613: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' <<< 19110 1726882544.76624: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py <<< 19110 1726882544.76651: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' <<< 19110 1726882544.76671: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py <<< 19110 1726882544.76690: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' <<< 19110 1726882544.76714: stdout chunk (state=3): >>>import '_stat' # <<< 19110 1726882544.76717: stdout chunk (state=3): >>>import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681cb0f190> <<< 19110 1726882544.76725: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches 
/usr/lib64/python3.9/_collections_abc.py <<< 19110 1726882544.76745: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' <<< 19110 1726882544.76819: stdout chunk (state=3): >>>import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681cb0f220> <<< 19110 1726882544.76853: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py <<< 19110 1726882544.76862: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' <<< 19110 1726882544.76886: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py # code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681cb32850> <<< 19110 1726882544.76891: stdout chunk (state=3): >>>import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681cb0f940> <<< 19110 1726882544.76910: stdout chunk (state=3): >>>import 'os' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681cb70880> <<< 19110 1726882544.76933: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py # code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' <<< 19110 1726882544.76943: stdout chunk (state=3): >>>import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681cb08d90> <<< 19110 1726882544.77000: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py # code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' <<< 19110 1726882544.77002: stdout chunk (state=3): >>>import '_locale' # import '_bootlocale' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f681cb32d90> <<< 19110 1726882544.77060: stdout chunk (state=3): >>>import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681cb58970> <<< 19110 1726882544.77083: stdout chunk (state=3): >>>Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. <<< 19110 1726882544.77419: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py <<< 19110 1726882544.77422: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' <<< 19110 1726882544.77452: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py # code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' <<< 19110 1726882544.77473: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py <<< 19110 1726882544.77488: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' <<< 19110 1726882544.77514: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py <<< 19110 1726882544.77521: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' <<< 19110 1726882544.77532: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681cad3eb0> <<< 19110 1726882544.77579: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681cad6f40> <<< 19110 1726882544.77598: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py <<< 19110 1726882544.77610: stdout chunk (state=3): >>># code object 
from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' <<< 19110 1726882544.77627: stdout chunk (state=3): >>>import '_sre' # <<< 19110 1726882544.77637: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py <<< 19110 1726882544.77660: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' <<< 19110 1726882544.77684: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py <<< 19110 1726882544.77690: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' <<< 19110 1726882544.77706: stdout chunk (state=3): >>>import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681cacc610> <<< 19110 1726882544.77726: stdout chunk (state=3): >>>import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681cad2640> <<< 19110 1726882544.77735: stdout chunk (state=3): >>>import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681cad3370> <<< 19110 1726882544.77753: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py <<< 19110 1726882544.77834: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' <<< 19110 1726882544.77852: stdout chunk (state=3): >>># /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/collections/__init__.py <<< 19110 1726882544.77884: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' <<< 19110 1726882544.77901: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py # code object from 
'/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' <<< 19110 1726882544.77947: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' <<< 19110 1726882544.77955: stdout chunk (state=3): >>># extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f681c790e20> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681c790910> <<< 19110 1726882544.77960: stdout chunk (state=3): >>>import 'itertools' # <<< 19110 1726882544.77993: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py # code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' <<< 19110 1726882544.77998: stdout chunk (state=3): >>>import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681c790f10> <<< 19110 1726882544.78004: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches /usr/lib64/python3.9/operator.py <<< 19110 1726882544.78041: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' <<< 19110 1726882544.78046: stdout chunk (state=3): >>>import '_operator' # <<< 19110 1726882544.78051: stdout chunk (state=3): >>>import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681c790fd0> <<< 19110 1726882544.78070: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py # code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' <<< 19110 1726882544.78087: stdout chunk (state=3): >>>import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681c7a30d0> import '_collections' # <<< 19110 1726882544.78143: stdout chunk (state=3): >>>import 
'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681caaed90> <<< 19110 1726882544.78146: stdout chunk (state=3): >>>import '_functools' # <<< 19110 1726882544.78170: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681caa7670> <<< 19110 1726882544.78227: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py # code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' <<< 19110 1726882544.78260: stdout chunk (state=3): >>>import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681caba6d0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681cadae20> <<< 19110 1726882544.78269: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' <<< 19110 1726882544.78294: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f681c7a3cd0> <<< 19110 1726882544.78297: stdout chunk (state=3): >>>import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681caae2b0> <<< 19110 1726882544.78345: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' <<< 19110 1726882544.78348: stdout chunk (state=3): >>># extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f681caba2e0> <<< 19110 1726882544.78373: stdout 
chunk (state=3): >>>import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681cae09d0> <<< 19110 1726882544.78379: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py # code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' <<< 19110 1726882544.78397: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' <<< 19110 1726882544.78416: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py <<< 19110 1726882544.78442: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' <<< 19110 1726882544.78448: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681c7a3eb0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681c7a3df0> <<< 19110 1726882544.78498: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches /usr/lib64/python3.9/importlib/machinery.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' <<< 19110 1726882544.78501: stdout chunk (state=3): >>>import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681c7a3d60> <<< 19110 1726882544.78507: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py <<< 19110 1726882544.78513: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' <<< 19110 1726882544.78524: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches 
/usr/lib64/python3.9/importlib/abc.py <<< 19110 1726882544.78535: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' <<< 19110 1726882544.78559: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py <<< 19110 1726882544.78607: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' <<< 19110 1726882544.78638: stdout chunk (state=3): >>># /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py # code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681c7763d0> <<< 19110 1726882544.78658: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches /usr/lib64/python3.9/contextlib.py <<< 19110 1726882544.78676: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' <<< 19110 1726882544.78700: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681c7764c0> <<< 19110 1726882544.78825: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681c7aaf40> <<< 19110 1726882544.78866: stdout chunk (state=3): >>>import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681c7a5a90> <<< 19110 1726882544.78891: stdout chunk (state=3): >>>import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681c7a5490> <<< 19110 1726882544.78904: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py # code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' <<< 19110 1726882544.78930: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py <<< 19110 1726882544.78946: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' <<< 19110 1726882544.78971: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py <<< 19110 1726882544.78984: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681c6aa220> <<< 19110 1726882544.79010: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681c761520> <<< 19110 1726882544.79072: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681c7a5f10> import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681cae0040> <<< 19110 1726882544.79090: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py <<< 19110 1726882544.79120: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' <<< 19110 1726882544.79139: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py # code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' <<< 19110 1726882544.79151: stdout chunk (state=3): >>>import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681c6bcb50> import 'errno' # <<< 19110 1726882544.79189: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f681c6bce80> <<< 19110 1726882544.79206: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py <<< 19110 1726882544.79232: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' <<< 19110 1726882544.79248: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py # code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681c6ce790> <<< 19110 1726882544.79269: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py <<< 19110 1726882544.79302: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' <<< 19110 1726882544.79332: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681c6cecd0> <<< 19110 1726882544.79369: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f681c666400> <<< 19110 1726882544.79386: stdout chunk (state=3): >>>import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681c6bcf70> <<< 19110 1726882544.79402: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py # code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' <<< 19110 1726882544.79449: stdout chunk (state=3): >>># extension module '_lzma' loaded from 
'/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' <<< 19110 1726882544.79466: stdout chunk (state=3): >>># extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f681c6772e0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681c6ce610> import 'pwd' # <<< 19110 1726882544.79494: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f681c6773a0> <<< 19110 1726882544.79536: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681c7a3a30> <<< 19110 1726882544.79557: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py <<< 19110 1726882544.79576: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' <<< 19110 1726882544.79595: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py <<< 19110 1726882544.79607: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' <<< 19110 1726882544.79666: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f681c692700> # /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py # code object from 
'/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' <<< 19110 1726882544.79684: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' <<< 19110 1726882544.79701: stdout chunk (state=3): >>># extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f681c6929d0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681c6927c0> <<< 19110 1726882544.79717: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f681c6928b0> <<< 19110 1726882544.79771: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc matches /usr/lib64/python3.9/hashlib.py # code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' <<< 19110 1726882544.79945: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' <<< 19110 1726882544.79969: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f681c692d00> <<< 19110 1726882544.79992: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 
0x7f681c69d250> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681c692940> <<< 19110 1726882544.80007: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681c686a90> <<< 19110 1726882544.80018: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681c7a3610> <<< 19110 1726882544.80043: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py <<< 19110 1726882544.80101: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' <<< 19110 1726882544.80137: stdout chunk (state=3): >>>import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681c692af0> <<< 19110 1726882544.80286: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/cp437.pyc' <<< 19110 1726882544.80304: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f681c5ba6d0> <<< 19110 1726882544.80562: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip' <<< 19110 1726882544.80567: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.80650: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.80694: stdout chunk (state=3): >>>import ansible # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/__init__.py <<< 19110 1726882544.80699: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.80720: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/__init__.py <<< 19110 1726882544.80732: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 
1726882544.81968: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.82897: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py # code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681bf04820> <<< 19110 1726882544.82932: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py <<< 19110 1726882544.82936: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' <<< 19110 1726882544.82947: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' <<< 19110 1726882544.82971: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py # code object from '/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' <<< 19110 1726882544.82999: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f681bf92730> <<< 19110 1726882544.83034: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681bf92610> <<< 19110 1726882544.83077: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681bf92340> <<< 19110 1726882544.83090: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches 
/usr/lib64/python3.9/json/encoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' <<< 19110 1726882544.83144: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681bf92460> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681bf92160> <<< 19110 1726882544.83148: stdout chunk (state=3): >>>import 'atexit' # <<< 19110 1726882544.83181: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' <<< 19110 1726882544.83184: stdout chunk (state=3): >>># extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f681bf923a0> <<< 19110 1726882544.83194: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py <<< 19110 1726882544.83223: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' <<< 19110 1726882544.83269: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681bf92790> <<< 19110 1726882544.83283: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py <<< 19110 1726882544.83299: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' <<< 19110 1726882544.83316: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py <<< 19110 1726882544.83334: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' <<< 19110 1726882544.83362: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches 
/usr/lib64/python3.9/signal.py # code object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' <<< 19110 1726882544.83445: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681bf82820> <<< 19110 1726882544.83485: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f681bf82490> <<< 19110 1726882544.83514: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f681bf82640> <<< 19110 1726882544.83537: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py <<< 19110 1726882544.83540: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' <<< 19110 1726882544.83578: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681be88520> <<< 19110 1726882544.83596: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681bf8dd60> <<< 19110 1726882544.83760: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681bf924f0> <<< 19110 1726882544.83776: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py # code object from 
'/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' <<< 19110 1726882544.83799: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681bf8d1c0> <<< 19110 1726882544.83820: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py <<< 19110 1726882544.83831: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' <<< 19110 1726882544.83866: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py # code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' <<< 19110 1726882544.83888: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches /usr/lib64/python3.9/tokenize.py <<< 19110 1726882544.83894: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' <<< 19110 1726882544.83930: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py # code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' <<< 19110 1726882544.83933: stdout chunk (state=3): >>>import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681bf91b20> <<< 19110 1726882544.84014: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681bf61160> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681bf61760> <<< 19110 1726882544.84017: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681be8ed30> <<< 19110 1726882544.84057: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' # extension module 'syslog' executed from 
'/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f681bf61670> <<< 19110 1726882544.84081: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' <<< 19110 1726882544.84084: stdout chunk (state=3): >>>import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681bfe3d00> <<< 19110 1726882544.84108: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py <<< 19110 1726882544.84112: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' <<< 19110 1726882544.84128: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py <<< 19110 1726882544.84167: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' <<< 19110 1726882544.84227: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f681bee5a00> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681bfede80> <<< 19110 1726882544.84258: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py <<< 19110 1726882544.84273: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' <<< 19110 1726882544.84333: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f681bef30a0> <<< 19110 1726882544.84336: stdout chunk (state=3): >>>import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681bfedeb0> <<< 19110 1726882544.84348: stdout chunk (state=3): >>># /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py <<< 19110 1726882544.84387: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' <<< 19110 1726882544.84418: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py <<< 19110 1726882544.84421: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' import '_string' # <<< 19110 1726882544.84484: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681bff5250> <<< 19110 1726882544.84610: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681bef30d0> <<< 19110 1726882544.84709: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f681bff5a60> <<< 19110 1726882544.84739: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded 
from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f681bfb7b80> <<< 19110 1726882544.84789: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f681bfedcd0> <<< 19110 1726882544.84794: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681bfe3ee0> <<< 19110 1726882544.84828: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py <<< 19110 1726882544.84834: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' <<< 19110 1726882544.84837: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py <<< 19110 1726882544.84852: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' <<< 19110 1726882544.84897: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' <<< 19110 1726882544.84903: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f681beef0d0> <<< 19110 1726882544.85087: stdout chunk 
(state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' <<< 19110 1726882544.85091: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f681bee6310> <<< 19110 1726882544.85093: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681beefcd0> <<< 19110 1726882544.85136: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' <<< 19110 1726882544.85141: stdout chunk (state=3): >>># extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f681beef670> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681bef0100> <<< 19110 1726882544.85143: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.85174: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.compat # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/compat/__init__.py <<< 19110 1726882544.85177: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.85247: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.85350: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.85356: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.85358: stdout chunk (state=3): >>>import ansible.module_utils.common # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/common/__init__.py <<< 19110 1726882544.85361: 
stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.85381: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/common/text/__init__.py # zipimport: zlib available <<< 19110 1726882544.85483: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.85576: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.86032: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.86490: stdout chunk (state=3): >>>import ansible.module_utils.six # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/six/__init__.py import 'ansible.module_utils.six.moves' # <<< 19110 1726882544.86509: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves.collections_abc' # import ansible.module_utils.common.text.converters # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/common/text/converters.py <<< 19110 1726882544.86532: stdout chunk (state=3): >>># /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py <<< 19110 1726882544.86536: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' <<< 19110 1726882544.86597: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f681bf00910> <<< 19110 1726882544.86666: stdout chunk (state=3): >>># /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py # code object from 
'/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' <<< 19110 1726882544.86679: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681bf3e9a0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681baa0640> <<< 19110 1726882544.86740: stdout chunk (state=3): >>>import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/compat/selinux.py # zipimport: zlib available<<< 19110 1726882544.86743: stdout chunk (state=3): >>> <<< 19110 1726882544.86757: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.86783: stdout chunk (state=3): >>>import ansible.module_utils._text # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/_text.py <<< 19110 1726882544.86786: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.86906: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.87038: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py <<< 19110 1726882544.87042: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' <<< 19110 1726882544.87071: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681bf697f0> <<< 19110 1726882544.87074: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.87458: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.87822: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.87880: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.87946: stdout chunk (state=3): >>>import ansible.module_utils.common.collections # loaded from Zip 
/tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/common/collections.py <<< 19110 1726882544.87949: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.87984: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.88018: stdout chunk (state=3): >>>import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/common/warnings.py <<< 19110 1726882544.88021: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.88075: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.88160: stdout chunk (state=3): >>>import ansible.module_utils.errors # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/errors.py <<< 19110 1726882544.88166: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.88190: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/parsing/__init__.py <<< 19110 1726882544.88194: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.88220: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.88265: stdout chunk (state=3): >>>import ansible.module_utils.parsing.convert_bool # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/parsing/convert_bool.py <<< 19110 1726882544.88269: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.88449: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.88638: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py <<< 19110 1726882544.88669: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' <<< 19110 1726882544.88681: stdout chunk (state=3): >>>import '_ast' # <<< 19110 1726882544.88753: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681bfaf460> # zipimport: zlib available <<< 19110 1726882544.88812: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.88897: stdout chunk (state=3): >>>import ansible.module_utils.common.text.formatters # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/common/text/formatters.py import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/common/validation.py import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/common/parameters.py <<< 19110 1726882544.88901: stdout chunk (state=3): >>>import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/common/arg_spec.py <<< 19110 1726882544.88912: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.88952: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.88997: stdout chunk (state=3): >>>import ansible.module_utils.common.locale # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/common/locale.py <<< 19110 1726882544.89000: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.89031: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.89071: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.89157: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.89225: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py <<< 19110 1726882544.89238: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' <<< 19110 1726882544.89322: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f681bf25f40> <<< 19110 1726882544.89423: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681bf3e1f0> <<< 19110 1726882544.89468: stdout chunk (state=3): >>>import ansible.module_utils.common.file # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/common/file.py import ansible.module_utils.common.process # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/common/process.py <<< 19110 1726882544.89471: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.89523: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.89575: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.89602: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.89640: stdout chunk (state=3): >>># /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py <<< 19110 1726882544.89658: stdout chunk (state=3): >>># code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' <<< 19110 1726882544.89668: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py <<< 19110 1726882544.89702: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' <<< 19110 1726882544.89729: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py <<< 19110 1726882544.89744: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' <<< 19110 1726882544.89825: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681bf35bb0> <<< 19110 1726882544.89869: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681bffe070> <<< 19110 1726882544.89928: stdout chunk (state=3): >>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681bf262e0> # destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/distro/__init__.py <<< 19110 1726882544.89931: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.89965: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.89982: stdout chunk (state=3): >>>import ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/common/_utils.py import ansible.module_utils.common.sys_info # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/common/sys_info.py <<< 19110 1726882544.90059: stdout chunk (state=3): >>>import ansible.module_utils.basic # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/basic.py <<< 19110 1726882544.90091: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available 
<<< 19110 1726882544.90096: stdout chunk (state=3): >>>import ansible.modules # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/modules/__init__.py <<< 19110 1726882544.90113: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.90161: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.90222: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.90244: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.90247: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.90287: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.90319: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.90351: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.90391: stdout chunk (state=3): >>>import ansible.module_utils.facts.namespace # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/facts/namespace.py <<< 19110 1726882544.90394: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.90459: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.90524: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.90536: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.90582: stdout chunk (state=3): >>>import ansible.module_utils.compat.typing # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/compat/typing.py <<< 19110 1726882544.90585: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.90726: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.90870: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.90900: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.90951: stdout chunk (state=3): 
>>># /usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/__init__.py <<< 19110 1726882544.90956: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc' <<< 19110 1726882544.91000: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/context.py <<< 19110 1726882544.91005: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc' <<< 19110 1726882544.91007: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/process.py <<< 19110 1726882544.91023: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc' <<< 19110 1726882544.91035: stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681ba55400> <<< 19110 1726882544.91073: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/reduction.py <<< 19110 1726882544.91077: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc' <<< 19110 1726882544.91090: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc matches /usr/lib64/python3.9/pickle.py <<< 19110 1726882544.91119: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc' <<< 19110 1726882544.91151: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc matches /usr/lib64/python3.9/_compat_pickle.py <<< 19110 1726882544.91156: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc' <<< 19110 1726882544.91170: stdout chunk (state=3): >>>import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681bab49a0> <<< 19110 1726882544.91202: stdout chunk (state=3): >>># extension module '_pickle' loaded from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' <<< 19110 1726882544.91205: stdout chunk (state=3): >>># extension module '_pickle' executed from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f681bab4df0> <<< 19110 1726882544.91271: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681bab1490> <<< 19110 1726882544.91286: stdout chunk (state=3): >>>import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681b92c040> <<< 19110 1726882544.91314: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681b81c3a0> <<< 19110 1726882544.91317: stdout chunk (state=3): >>>import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681b81c5e0> <<< 19110 1726882544.91340: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/pool.py <<< 19110 1726882544.91380: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc' <<< 19110 1726882544.91383: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc matches /usr/lib64/python3.9/queue.py <<< 19110 1726882544.91385: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc' <<< 19110 1726882544.91428: stdout chunk (state=3): >>># extension module '_queue' loaded from 
'/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' <<< 19110 1726882544.91434: stdout chunk (state=3): >>># extension module '_queue' executed from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f681bf226d0> <<< 19110 1726882544.91440: stdout chunk (state=3): >>>import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681ba9f730> <<< 19110 1726882544.91474: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/util.py <<< 19110 1726882544.91493: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc' <<< 19110 1726882544.91496: stdout chunk (state=3): >>>import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681bf225e0> # /usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/connection.py <<< 19110 1726882544.91541: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc' <<< 19110 1726882544.91570: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f681ba64c70> <<< 19110 1726882544.91604: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681b87b9a0> <<< 19110 1726882544.91643: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681b81c4f0> 
import ansible.module_utils.facts.timeout # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/facts/timeout.py import ansible.module_utils.facts.collector # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/facts/collector.py <<< 19110 1726882544.91669: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/facts/other/__init__.py # zipimport: zlib available <<< 19110 1726882544.91720: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.91784: stdout chunk (state=3): >>>import ansible.module_utils.facts.other.facter # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/facts/other/facter.py <<< 19110 1726882544.91787: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.91813: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.91891: stdout chunk (state=3): >>>import ansible.module_utils.facts.other.ohai # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/facts/other/ohai.py # zipimport: zlib available <<< 19110 1726882544.91906: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.facts.system # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/facts/system/__init__.py # zipimport: zlib available <<< 19110 1726882544.91926: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.91959: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.apparmor # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/facts/system/apparmor.py # zipimport: zlib available <<< 19110 1726882544.92003: 
stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.92045: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.caps # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/facts/system/caps.py <<< 19110 1726882544.92068: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.92086: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.92128: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.chroot # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/facts/system/chroot.py <<< 19110 1726882544.92140: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.92184: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.92240: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.92281: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.92342: stdout chunk (state=3): >>>import ansible.module_utils.facts.utils # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/facts/utils.py import ansible.module_utils.facts.system.cmdline # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/facts/system/cmdline.py <<< 19110 1726882544.92353: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.92742: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.93103: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.distribution # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/facts/system/distribution.py <<< 19110 1726882544.93115: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.93168: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.93524: stdout chunk 
(state=3): >>># zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.datetime # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/compat/datetime.py import ansible.module_utils.facts.system.date_time # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/facts/system/date_time.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.env # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/facts/system/env.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.dns # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/facts/system/dns.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.fips # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/facts/system/fips.py # zipimport: zlib available # zipimport: zlib available <<< 19110 1726882544.93559: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.loadavg # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/facts/system/loadavg.py # zipimport: zlib available <<< 19110 1726882544.93627: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.93701: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc matches /usr/lib64/python3.9/glob.py # code object from '/usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc' <<< 19110 1726882544.93723: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681b81c9d0> <<< 19110 1726882544.93744: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc matches 
/usr/lib64/python3.9/configparser.py <<< 19110 1726882544.93771: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc' <<< 19110 1726882544.93927: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681b79bf40> import ansible.module_utils.facts.system.local # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/facts/system/local.py <<< 19110 1726882544.93935: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.93992: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.94048: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.lsb # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/facts/system/lsb.py # zipimport: zlib available <<< 19110 1726882544.94180: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.94270: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.pkg_mgr # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/facts/system/pkg_mgr.py # zipimport: zlib available <<< 19110 1726882544.94273: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.94473: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.platform # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/facts/system/platform.py # zipimport: zlib available <<< 19110 1726882544.94477: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.94479: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc matches /usr/lib64/python3.9/ssl.py <<< 19110 1726882544.94482: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc' <<< 19110 1726882544.94590: stdout chunk (state=3): 
>>># extension module '_ssl' loaded from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f681b7933a0> <<< 19110 1726882544.94836: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681b7e1100> import ansible.module_utils.facts.system.python # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/facts/system/python.py <<< 19110 1726882544.94843: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.94894: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.94938: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.selinux # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/facts/system/selinux.py <<< 19110 1726882544.94945: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.95013: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.95088: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.95177: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.95313: stdout chunk (state=3): >>>import ansible.module_utils.compat.version # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/compat/version.py import ansible.module_utils.facts.system.service_mgr # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/facts/system/service_mgr.py <<< 19110 1726882544.95320: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.95350: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.95393: stdout chunk (state=3): >>>import 
ansible.module_utils.facts.system.ssh_pub_keys # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/facts/system/ssh_pub_keys.py # zipimport: zlib available <<< 19110 1726882544.95430: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.95479: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc matches /usr/lib64/python3.9/getpass.py # code object from '/usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc' <<< 19110 1726882544.95551: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f681b7276a0> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681b727a90> import ansible.module_utils.facts.system.user # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/facts/system/user.py <<< 19110 1726882544.95584: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.95587: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.facts.hardware # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/__init__.py # zipimport: zlib available <<< 19110 1726882544.95610: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.95653: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.base # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/base.py # zipimport: zlib available <<< 19110 1726882544.95788: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.95924: stdout chunk (state=3): 
>>>import ansible.module_utils.facts.hardware.aix # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/aix.py <<< 19110 1726882544.95928: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.95999: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.96080: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.96111: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.96168: stdout chunk (state=3): >>>import ansible.module_utils.facts.sysctl # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/facts/sysctl.py import ansible.module_utils.facts.hardware.darwin # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/darwin.py <<< 19110 1726882544.96172: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.96252: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.96276: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.96383: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.96514: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.freebsd # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/freebsd.py import ansible.module_utils.facts.hardware.dragonfly # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/dragonfly.py <<< 19110 1726882544.96517: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.96610: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.96717: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.hpux # loaded from Zip 
/tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/hpux.py # zipimport: zlib available <<< 19110 1726882544.96757: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.96780: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.97217: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.97627: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.linux # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/linux.py import ansible.module_utils.facts.hardware.hurd # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/hurd.py <<< 19110 1726882544.97639: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.97732: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.97822: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.netbsd # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/netbsd.py <<< 19110 1726882544.97829: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.97909: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.97998: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.openbsd # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/openbsd.py <<< 19110 1726882544.98004: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.98127: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.98279: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.sunos # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/sunos.py # zipimport: zlib 
available <<< 19110 1726882544.98291: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.facts.network # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/facts/network/__init__.py <<< 19110 1726882544.98298: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.98334: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.98378: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.base # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/facts/network/base.py <<< 19110 1726882544.98385: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.98468: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.98552: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.98719: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.98886: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.generic_bsd # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/facts/network/generic_bsd.py import ansible.module_utils.facts.network.aix # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/facts/network/aix.py <<< 19110 1726882544.98900: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.98947: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.98973: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.darwin # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/facts/network/darwin.py <<< 19110 1726882544.98981: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.98997: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.99012: stdout chunk 
(state=3): >>>import ansible.module_utils.facts.network.dragonfly # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/facts/network/dragonfly.py <<< 19110 1726882544.99027: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.99087: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.99156: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.fc_wwn # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/facts/network/fc_wwn.py # zipimport: zlib available <<< 19110 1726882544.99212: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.99219: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.freebsd # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/facts/network/freebsd.py <<< 19110 1726882544.99223: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.99258: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.99313: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.hpux # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/facts/network/hpux.py # zipimport: zlib available <<< 19110 1726882544.99369: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.99423: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.hurd # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/facts/network/hurd.py <<< 19110 1726882544.99428: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.99640: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.99862: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.linux # loaded from Zip 
/tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/facts/network/linux.py # zipimport: zlib available <<< 19110 1726882544.99915: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.99969: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.iscsi # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/facts/network/iscsi.py <<< 19110 1726882544.99992: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882544.99998: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882545.00035: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.nvme # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/facts/network/nvme.py <<< 19110 1726882545.00042: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882545.00075: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882545.00103: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.netbsd # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/facts/network/netbsd.py <<< 19110 1726882545.00109: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882545.00136: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882545.00178: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.openbsd # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/facts/network/openbsd.py # zipimport: zlib available <<< 19110 1726882545.00247: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882545.00319: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.sunos # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/facts/network/sunos.py <<< 19110 1726882545.00336: stdout 
chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882545.00346: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.facts.virtual # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/__init__.py <<< 19110 1726882545.00359: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882545.00398: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882545.00444: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.base # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/base.py # zipimport: zlib available <<< 19110 1726882545.00476: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882545.00484: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882545.00530: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882545.00572: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882545.00634: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882545.00720: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.sysctl # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/sysctl.py import ansible.module_utils.facts.virtual.freebsd # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/freebsd.py import ansible.module_utils.facts.virtual.dragonfly # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/dragonfly.py <<< 19110 1726882545.00723: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882545.00766: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882545.00807: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.hpux # loaded 
from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/hpux.py # zipimport: zlib available <<< 19110 1726882545.01000: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882545.01875: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.linux # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/linux.py <<< 19110 1726882545.01878: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882545.01880: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882545.01882: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.netbsd # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/netbsd.py # zipimport: zlib available <<< 19110 1726882545.01884: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882545.01885: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.openbsd # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/openbsd.py <<< 19110 1726882545.01889: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882545.01891: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882545.01893: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.sunos # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/sunos.py import ansible.module_utils.facts.default_collectors # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/facts/default_collectors.py <<< 19110 1726882545.01895: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882545.01896: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882545.01899: stdout chunk (state=3): >>>import 
ansible.module_utils.facts.ansible_collector # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/facts/ansible_collector.py import ansible.module_utils.facts.compat # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/facts/compat.py import ansible.module_utils.facts # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/facts/__init__.py <<< 19110 1726882545.01901: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882545.01982: stdout chunk (state=3): >>>import 'gc' # <<< 19110 1726882545.02885: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc matches /usr/lib64/python3.9/encodings/idna.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc' <<< 19110 1726882545.02902: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc matches /usr/lib64/python3.9/stringprep.py <<< 19110 1726882545.02916: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc' <<< 19110 1726882545.02947: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f681b7095e0> <<< 19110 1726882545.02960: stdout chunk (state=3): >>>import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681b70bc10> <<< 19110 1726882545.03019: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681b70bcd0> <<< 19110 1726882545.04250: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_system_capabilities_enforced": 
"False", "ansible_system_capabilities": [], "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.40.7 34614 10.31.44.90 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.40.7 34614 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_lsb": {}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", 
"mode": "enforcing", "type": "targeted"}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "35", "second": "45", "epoch": "1726882545", "epoch_int": "1726882545", "date": "2024-09-20", "time": "21:35:45", "iso8601_micro": "2024-09-21T01:35:45.027352Z", "iso8601": "2024-09-21T01:35:45Z", "iso8601_basic": "20240920T213545027352", "iso8601_basic_short": "20240920T213545", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_apparmor": {"status": "disabled"}, "ansible_local": {}, "ansible_pkg_mgr": "dnf", "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBANd7RrvRqB/kTBmp4g3vOXSd1XQX7zSKmnuTmpsZ60vmB0PwXJIj7HTI9DBqGrzofXOOHlh/Qs4ml+D2H79EO+6EUEZza9meZ+anJVVwXBuxgfn1Hl/EExsmp7gV5o+r0rCFeuFds6ISUZQpal2OlQk3lrit00RA//JoxvQO1YQdAAAAFQDxLJGSeERY5sZYyGr88om1BNq9KQAAAIEA1WO5ElHMof6Upf9GQZn4wlJh6JUOjZfYnL4XATh/W6uye7kuC/rBqGPirkmks1GCUkKhSkzQlRIPyLYENrbPKRMNarYAwwQ8N8BDOUWFDCcrO55SJdlLTyYGWdlVFysYGSMIyZT5ye4oL3Nkff/e1ZGqjvL0sLNJaye4za70Xj4AAACAXRnpJPDKgCzakgHO14HsH3r7qS4cvfQld+ThBJZcbtKtJUyUo1sqg9NbvK+hvA41TYWOVy52DcWn5yzPwfhAn+mQEcAdBXWqSSJdESZ9fPnbc9C1XO02sYpyX5+wsZnau23XhtnlnY8jRTpWgRt4X8AWaSND9mfeL8ZDTgVphRc=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQCi3knQjBMOMykE1WC1CWkOsV5q0hmL0iOCSTq050rqbxjUmkXoe4BeOWZh4ID8k5GgD5emRh5PU5ME25rsP6hnelUBzDOCjdgI1rmTkUbw5oFRN+kpX2RbAM/2L5J6RrqqllFBjrHtSG6JINsPIWqLn9rlmE965Kj4AY3QplgiApeN07yDM5YPS2tpRpVHVIxZia7pdzKs+h+TXJoo/Z7SMAw8MTUNeIXd9bSzuxhulOrLNrvrcej4EVg88FYiy96oA+NBNLFb41RHNLumM3xUQvjCeyqP1hcUw9YAY+gwADhRGBKQ+JCAzzLqyM/3RAO8gXXJr1Yjr+H9xi8IwKB71Q<<< 19110 1726882545.04268: stdout chunk (state=3): >>>U/lw7bWc33YuNbe5mDlUHQ/a2qvo4O96wD8m4eZpu81iHiwtIU5cwKm+fk8sz9kxOR77AozaYLtjgW9FYUmxh2ChfTBV2rnFflhC3CjgRMlZv8CLtne5JcsRFSpHeCB2RXzA1JPiF89OxoUFa8NsPqdUyAMUkaR8MmW+fj+t8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAqf0PBdBRTqmYqGYFABEC2soCa/Gsi3A2munUEZEo0enWjwRQivB5pKJDjOn6lwgGyr2ebU0/VpRzddk73uEfk=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOdij61fBkKDElS700Z560nYW2c4QCIx/VplUW7jn+UE", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_system": "Linux", "ansible_kernel": "5.14.0-508.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 15:49:37 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "ip-10-31-44-90.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-44-90", "ansible_nodename": "ip-10-31-44-90.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "465293f2bd9b457497a5eaf565f184f8", "ansible_fips": false, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": 
["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_service_mgr": "systemd", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 19110 1726882545.04856: stdout chunk (state=3): >>># clear builtins._ # clear sys.path <<< 19110 1726882545.04862: stdout chunk (state=3): >>># clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins <<< 19110 1726882545.04870: stdout chunk (state=3): >>># cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat <<< 19110 1726882545.04878: stdout chunk (state=3): >>># cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing 
enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator <<< 19110 1726882545.04883: stdout chunk (state=3): >>># cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # 
cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 <<< 19110 1726882545.04890: stdout chunk (state=3): >>># cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog <<< 19110 1726882545.04893: stdout chunk (state=3): >>># cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text <<< 19110 1726882545.04896: stdout chunk (state=3): >>># destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy 
ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing <<< 19110 1726882545.04901: stdout chunk (state=3): >>># destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 <<< 19110 1726882545.04906: stdout chunk (state=3): >>># cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing 
ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle <<< 19110 1726882545.04925: stdout chunk (state=3): >>># cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing 
ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing 
ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing 
ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy 
ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing gc # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna <<< 19110 1726882545.05177: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 19110 1726882545.05195: stdout chunk (state=3): >>># destroy importlib.util # destroy importlib.abc # destroy importlib.machinery <<< 19110 1726882545.05220: stdout chunk (state=3): >>># destroy zipimport # destroy 
_compression <<< 19110 1726882545.05243: stdout chunk (state=3): >>># destroy binascii # destroy importlib # destroy bz2 # destroy lzma <<< 19110 1726882545.05278: stdout chunk (state=3): >>># destroy __main__ # destroy locale # destroy systemd.journal # destroy systemd.daemon # destroy hashlib <<< 19110 1726882545.05285: stdout chunk (state=3): >>># destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings <<< 19110 1726882545.05299: stdout chunk (state=3): >>># destroy syslog # destroy uuid <<< 19110 1726882545.05346: stdout chunk (state=3): >>># destroy selinux # destroy distro # destroy logging # destroy argparse <<< 19110 1726882545.05379: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors<<< 19110 1726882545.05409: stdout chunk (state=3): >>> # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle <<< 19110 1726882545.05459: stdout chunk (state=3): >>># destroy queue # destroy multiprocessing.process <<< 19110 1726882545.05462: stdout chunk (state=3): >>># destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction <<< 19110 1726882545.05478: stdout chunk (state=3): >>># destroy shlex # destroy datetime <<< 19110 1726882545.05524: stdout chunk (state=3): >>># destroy base64 # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy json <<< 19110 1726882545.05527: stdout chunk (state=3): >>># destroy socket # destroy struct <<< 19110 1726882545.05529: stdout chunk (state=3): >>># destroy glob # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector <<< 19110 1726882545.05587: stdout chunk (state=3): >>># cleanup[3] wiping encodings.idna <<< 19110 1726882545.05595: stdout chunk 
(state=3): >>># destroy stringprep # cleanup[3] wiping unicodedata # cleanup[3] wiping gc # cleanup[3] wiping termios # cleanup[3] wiping _ssl # cleanup[3] wiping configparser # cleanup[3] wiping _multiprocessing <<< 19110 1726882545.05638: stdout chunk (state=3): >>># cleanup[3] wiping _queue # cleanup[3] wiping _pickle # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform <<< 19110 1726882545.05673: stdout chunk (state=3): >>># destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal <<< 19110 1726882545.05700: stdout chunk (state=3): >>># cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap <<< 19110 1726882545.05758: stdout chunk (state=3): >>># cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] 
wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator <<< 19110 1726882545.05775: stdout chunk (state=3): >>># cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins <<< 19110 1726882545.05852: stdout chunk (state=3): >>># destroy unicodedata # destroy gc # destroy termios # destroy _ssl # destroy _multiprocessing # destroy _queue # destroy _pickle # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal <<< 19110 1726882545.05986: stdout chunk (state=3): >>># destroy platform # destroy _uuid <<< 19110 1726882545.06012: stdout chunk (state=3): >>># destroy _sre # destroy sre_parse # destroy tokenize # destroy _heapq # destroy posixpath # destroy stat <<< 19110 1726882545.06037: stdout 
chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select <<< 19110 1726882545.06050: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator <<< 19110 1726882545.06058: stdout chunk (state=3): >>># destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal <<< 19110 1726882545.06100: stdout chunk (state=3): >>># destroy _frozen_importlib # clear sys.audit hooks <<< 19110 1726882545.06461: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
<<< 19110 1726882545.06464: stdout chunk (state=3): >>><<< 19110 1726882545.06473: stderr chunk (state=3): >>><<< 19110 1726882545.06676: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py # code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' import '_codecs' # import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681cbb3dc0> # /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681cb583a0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681cbb3b20> # /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681cbb3ac0> import '_signal' # # /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681cb58490> # 
/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py # code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py # code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' import '_abc' # import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681cb58940> import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681cb58670> # /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py # code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py # code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py # code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' import '_stat' # import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681cb0f190> # /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py # code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681cb0f220> # /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py # code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py # code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681cb32850> import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681cb0f940> import 'os' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f681cb70880> # /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py # code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681cb08d90> # /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py # code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' import '_locale' # import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681cb32d90> import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681cb58970> Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py # code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py # code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py # code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py # code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681cad3eb0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681cad6f40> # /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py # code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' import '_sre' # # 
/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py # code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py # code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681cacc610> import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681cad2640> import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681cad3370> # /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py # code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/collections/__init__.py # code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py # code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f681c790e20> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681c790910> import 'itertools' # # /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py # code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681c790f10> # /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches 
/usr/lib64/python3.9/operator.py # code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681c790fd0> # /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py # code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681c7a30d0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681caaed90> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681caa7670> # /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py # code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681caba6d0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681cadae20> # /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f681c7a3cd0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681caae2b0> # extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f681caba2e0> import 
'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681cae09d0> # /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py # code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py # code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681c7a3eb0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681c7a3df0> # /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches /usr/lib64/python3.9/importlib/machinery.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681c7a3d60> # /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py # code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py # code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681c7763d0> # 
/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches /usr/lib64/python3.9/contextlib.py # code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681c7764c0> import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681c7aaf40> import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681c7a5a90> import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681c7a5490> # /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py # code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py # code object from '/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py # code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681c6aa220> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681c761520> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681c7a5f10> import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681cae0040> # /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py # code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py # code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681c6bcb50> import 'errno' # # extension module 'zlib' loaded from 
'/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f681c6bce80> # /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py # code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py # code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681c6ce790> # /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py # code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681c6cecd0> # extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f681c666400> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681c6bcf70> # /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py # code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f681c6772e0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681c6ce610> import 'pwd' # # extension module 'grp' 
loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f681c6773a0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681c7a3a30> # /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py # code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py # code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' # extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f681c692700> # /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f681c6929d0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681c6927c0> # extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f681c6928b0> # /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc 
matches /usr/lib64/python3.9/hashlib.py # code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f681c692d00> # extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f681c69d250> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681c692940> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681c686a90> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681c7a3610> # /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py # code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681c692af0> # code object from '/usr/lib64/python3.9/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f681c5ba6d0> # zipimport: found 103 names in '/tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip' # zipimport: zlib available # zipimport: zlib available import ansible # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/__init__.py # zipimport: zlib available # zipimport: zlib available # 
/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py # code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681bf04820> # /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py # code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py # code object from '/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' # extension module '_json' loaded from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f681bf92730> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681bf92610> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681bf92340> # /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681bf92460> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681bf92160> import 'atexit' # # extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f681bf923a0> # /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py # code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681bf92790> # /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py # code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py # code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py # code object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681bf82820> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f681bf82490> # extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f681bf82640> # /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py # code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681be88520> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f681bf8dd60> import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681bf924f0> # /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py # code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681bf8d1c0> # /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py # code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py # code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches /usr/lib64/python3.9/tokenize.py # code object from '/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py # code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681bf91b20> import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681bf61160> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681bf61760> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681be8ed30> # extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f681bf61670> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' 
import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681bfe3d00> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py # code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f681bee5a00> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681bfede80> # /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py # code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f681bef30a0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681bfedeb0> # /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py # code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py # code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681bff5250> import 
'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681bef30d0> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f681bff5a60> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f681bfb7b80> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f681bfedcd0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681bfe3ee0> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py # code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object 
at 0x7f681beef0d0> # extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f681bee6310> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681beefcd0> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f681beef670> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681bef0100> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/compat/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/common/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/common/text/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.six # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/six/__init__.py import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import ansible.module_utils.common.text.converters 
# loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/common/text/converters.py # /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f681bf00910> # /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681bf3e9a0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681baa0640> import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/compat/selinux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils._text # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/_text.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py # code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681bf697f0> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.collections # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/common/collections.py # zipimport: 
zlib available # zipimport: zlib available import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/common/warnings.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.errors # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/errors.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/parsing/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing.convert_bool # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/parsing/convert_bool.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py # code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681bfaf460> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text.formatters # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/common/text/formatters.py import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/common/validation.py import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/common/parameters.py import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/common/arg_spec.py # zipimport: zlib available # zipimport: zlib available import 
ansible.module_utils.common.locale # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/common/locale.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f681bf25f40> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681bf3e1f0> import ansible.module_utils.common.file # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/common/file.py import ansible.module_utils.common.process # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/common/process.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py # code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py # code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py # code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' import 'gettext' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f681bf35bb0> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681bffe070> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681bf262e0> # destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/distro/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/common/_utils.py import ansible.module_utils.common.sys_info # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/common/sys_info.py import ansible.module_utils.basic # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/basic.py # zipimport: zlib available # zipimport: zlib available import ansible.modules # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/modules/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.namespace # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/facts/namespace.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.typing # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/compat/typing.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # 
/usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/__init__.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/context.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc' # /usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/process.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681ba55400> # /usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/reduction.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc matches /usr/lib64/python3.9/pickle.py # code object from '/usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc matches /usr/lib64/python3.9/_compat_pickle.py # code object from '/usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681bab49a0> # extension module '_pickle' loaded from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f681bab4df0> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681bab1490> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681b92c040> import 
'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681b81c3a0> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681b81c5e0> # /usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/pool.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc matches /usr/lib64/python3.9/queue.py # code object from '/usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc' # extension module '_queue' loaded from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f681bf226d0> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681ba9f730> # /usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/util.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681bf225e0> # /usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/connection.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f681ba64c70> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f681b87b9a0> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681b81c4f0> import ansible.module_utils.facts.timeout # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/facts/timeout.py import ansible.module_utils.facts.collector # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/facts/collector.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/facts/other/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other.facter # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/facts/other/facter.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other.ohai # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/facts/other/ohai.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/facts/system/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.apparmor # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/facts/system/apparmor.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.caps # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/facts/system/caps.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.chroot # loaded from Zip 
/tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/facts/system/chroot.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.utils # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/facts/utils.py import ansible.module_utils.facts.system.cmdline # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/facts/system/cmdline.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.distribution # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/facts/system/distribution.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.datetime # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/compat/datetime.py import ansible.module_utils.facts.system.date_time # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/facts/system/date_time.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.env # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/facts/system/env.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.dns # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/facts/system/dns.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.fips # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/facts/system/fips.py # zipimport: zlib available # zipimport: zlib available import 
ansible.module_utils.facts.system.loadavg # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/facts/system/loadavg.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc matches /usr/lib64/python3.9/glob.py # code object from '/usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681b81c9d0> # /usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc matches /usr/lib64/python3.9/configparser.py # code object from '/usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681b79bf40> import ansible.module_utils.facts.system.local # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/facts/system/local.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.lsb # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/facts/system/lsb.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.pkg_mgr # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/facts/system/pkg_mgr.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.platform # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/facts/system/platform.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc matches /usr/lib64/python3.9/ssl.py # code object from '/usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' # extension module '_ssl' executed from 
'/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f681b7933a0> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681b7e1100> import ansible.module_utils.facts.system.python # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/facts/system/python.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.selinux # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/facts/system/selinux.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.version # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/compat/version.py import ansible.module_utils.facts.system.service_mgr # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/facts/system/service_mgr.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.ssh_pub_keys # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/facts/system/ssh_pub_keys.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc matches /usr/lib64/python3.9/getpass.py # code object from '/usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f681b7276a0> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681b727a90> import 
ansible.module_utils.facts.system.user # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/facts/system/user.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.base # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/base.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.aix # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/aix.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.sysctl # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/facts/sysctl.py import ansible.module_utils.facts.hardware.darwin # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/darwin.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.freebsd # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/freebsd.py import ansible.module_utils.facts.hardware.dragonfly # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.hpux # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/hpux.py # 
zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.linux # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/linux.py import ansible.module_utils.facts.hardware.hurd # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/hurd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.netbsd # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/netbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.openbsd # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.sunos # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/sunos.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/facts/network/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.base # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/facts/network/base.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.generic_bsd # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/facts/network/generic_bsd.py import ansible.module_utils.facts.network.aix # loaded from Zip 
/tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/facts/network/aix.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.darwin # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/facts/network/darwin.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.dragonfly # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/facts/network/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.fc_wwn # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/facts/network/fc_wwn.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.freebsd # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/facts/network/freebsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.hpux # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/facts/network/hpux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.hurd # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/facts/network/hurd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.linux # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/facts/network/linux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.iscsi # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/facts/network/iscsi.py # zipimport: zlib available # zipimport: zlib available 
import ansible.module_utils.facts.network.nvme # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/facts/network/nvme.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.netbsd # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/facts/network/netbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.openbsd # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/facts/network/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.sunos # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/facts/network/sunos.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.base # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/base.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.sysctl # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/sysctl.py import ansible.module_utils.facts.virtual.freebsd # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/freebsd.py import ansible.module_utils.facts.virtual.dragonfly # loaded from Zip 
/tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.hpux # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/hpux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.linux # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/linux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.netbsd # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/netbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.openbsd # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.sunos # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/sunos.py import ansible.module_utils.facts.default_collectors # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/facts/default_collectors.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.ansible_collector # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/facts/ansible_collector.py import ansible.module_utils.facts.compat # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/facts/compat.py import ansible.module_utils.facts # loaded from Zip /tmp/ansible_setup_payload_8hyqasl7/ansible_setup_payload.zip/ansible/module_utils/facts/__init__.py # zipimport: 
zlib available import 'gc' # # /usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc matches /usr/lib64/python3.9/encodings/idna.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc matches /usr/lib64/python3.9/stringprep.py # code object from '/usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f681b7095e0> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681b70bc10> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f681b70bcd0> {"ansible_facts": {"ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.40.7 34614 10.31.44.90 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.40.7 34614 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions 
--show-tilde --show-dot $@\n}"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_lsb": {}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "35", "second": "45", "epoch": "1726882545", "epoch_int": "1726882545", "date": "2024-09-20", "time": "21:35:45", "iso8601_micro": "2024-09-21T01:35:45.027352Z", "iso8601": "2024-09-21T01:35:45Z", "iso8601_basic": "20240920T213545027352", "iso8601_basic_short": "20240920T213545", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_apparmor": {"status": "disabled"}, "ansible_local": {}, "ansible_pkg_mgr": "dnf", "ansible_ssh_host_key_dsa_public": 
"AAAAB3NzaC1kc3MAAACBANd7RrvRqB/kTBmp4g3vOXSd1XQX7zSKmnuTmpsZ60vmB0PwXJIj7HTI9DBqGrzofXOOHlh/Qs4ml+D2H79EO+6EUEZza9meZ+anJVVwXBuxgfn1Hl/EExsmp7gV5o+r0rCFeuFds6ISUZQpal2OlQk3lrit00RA//JoxvQO1YQdAAAAFQDxLJGSeERY5sZYyGr88om1BNq9KQAAAIEA1WO5ElHMof6Upf9GQZn4wlJh6JUOjZfYnL4XATh/W6uye7kuC/rBqGPirkmks1GCUkKhSkzQlRIPyLYENrbPKRMNarYAwwQ8N8BDOUWFDCcrO55SJdlLTyYGWdlVFysYGSMIyZT5ye4oL3Nkff/e1ZGqjvL0sLNJaye4za70Xj4AAACAXRnpJPDKgCzakgHO14HsH3r7qS4cvfQld+ThBJZcbtKtJUyUo1sqg9NbvK+hvA41TYWOVy52DcWn5yzPwfhAn+mQEcAdBXWqSSJdESZ9fPnbc9C1XO02sYpyX5+wsZnau23XhtnlnY8jRTpWgRt4X8AWaSND9mfeL8ZDTgVphRc=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCi3knQjBMOMykE1WC1CWkOsV5q0hmL0iOCSTq050rqbxjUmkXoe4BeOWZh4ID8k5GgD5emRh5PU5ME25rsP6hnelUBzDOCjdgI1rmTkUbw5oFRN+kpX2RbAM/2L5J6RrqqllFBjrHtSG6JINsPIWqLn9rlmE965Kj4AY3QplgiApeN07yDM5YPS2tpRpVHVIxZia7pdzKs+h+TXJoo/Z7SMAw8MTUNeIXd9bSzuxhulOrLNrvrcej4EVg88FYiy96oA+NBNLFb41RHNLumM3xUQvjCeyqP1hcUw9YAY+gwADhRGBKQ+JCAzzLqyM/3RAO8gXXJr1Yjr+H9xi8IwKB71QU/lw7bWc33YuNbe5mDlUHQ/a2qvo4O96wD8m4eZpu81iHiwtIU5cwKm+fk8sz9kxOR77AozaYLtjgW9FYUmxh2ChfTBV2rnFflhC3CjgRMlZv8CLtne5JcsRFSpHeCB2RXzA1JPiF89OxoUFa8NsPqdUyAMUkaR8MmW+fj+t8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAqf0PBdBRTqmYqGYFABEC2soCa/Gsi3A2munUEZEo0enWjwRQivB5pKJDjOn6lwgGyr2ebU0/VpRzddk73uEfk=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOdij61fBkKDElS700Z560nYW2c4QCIx/VplUW7jn+UE", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_system": "Linux", "ansible_kernel": "5.14.0-508.el9.x86_64", 
"ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 15:49:37 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "ip-10-31-44-90.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-44-90", "ansible_nodename": "ip-10-31-44-90.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "465293f2bd9b457497a5eaf565f184f8", "ansible_fips": false, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_service_mgr": "systemd", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # 
cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] 
removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy 
ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux 
# cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing 
ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing 
ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing 
ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy 
ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing gc # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # destroy _sitebuiltins # destroy importlib.util # destroy importlib.abc # destroy importlib.machinery # destroy zipimport # destroy _compression # destroy binascii # destroy importlib # destroy bz2 # destroy lzma # destroy __main__ # destroy locale # destroy systemd.journal # destroy systemd.daemon 
# destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog # destroy uuid # destroy selinux # destroy distro # destroy logging # destroy argparse # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy queue # destroy multiprocessing.process # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy shlex # destroy datetime # destroy base64 # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy json # destroy socket # destroy struct # destroy glob # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping unicodedata # cleanup[3] wiping gc # cleanup[3] wiping termios # cleanup[3] wiping _ssl # cleanup[3] wiping configparser # cleanup[3] wiping _multiprocessing # cleanup[3] wiping _queue # cleanup[3] wiping _pickle # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] 
wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # 
cleanup[3] wiping builtins # destroy unicodedata # destroy gc # destroy termios # destroy _ssl # destroy _multiprocessing # destroy _queue # destroy _pickle # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal # destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize # destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal # destroy _frozen_importlib # clear sys.audit hooks , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. [WARNING]: Module invocation had junk after the JSON data 19110 1726882545.07837: done with _execute_module (setup, {'gather_subset': 'min', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882544.5785136-19158-199492729758849/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 19110 1726882545.07840: _low_level_execute_command(): starting 19110 1726882545.07842: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882544.5785136-19158-199492729758849/ > /dev/null 2>&1 && sleep 0' 19110 1726882545.07844: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882545.07847: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config <<< 19110 1726882545.07849: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882545.07851: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882545.07852: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882545.07854: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882545.07856: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882545.07858: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882545.07860: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882545.07862: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882545.07867: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882545.07869: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882545.07871: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882545.07873: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882545.07875: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882545.07877: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882545.07879: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882545.07881: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882545.07883: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882545.07925: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session 
id: 2 <<< 19110 1726882545.09813: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882545.09816: stdout chunk (state=3): >>><<< 19110 1726882545.09819: stderr chunk (state=3): >>><<< 19110 1726882545.09969: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 1726882545.09972: handler run complete 19110 1726882545.09975: variable 'ansible_facts' from source: unknown 19110 1726882545.09977: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882545.10102: variable 'ansible_facts' from source: unknown 19110 1726882545.10153: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882545.10221: attempt loop complete, returning result 19110 1726882545.10230: _execute() done 19110 
1726882545.10236: dumping result to json 19110 1726882545.10252: done dumping result, returning 19110 1726882545.10267: done running TaskExecutor() for managed_node1/TASK: Gather the minimum subset of ansible_facts required by the network role test [0e448fcc-3ce9-5372-c19a-00000000008d] 19110 1726882545.10278: sending task result for task 0e448fcc-3ce9-5372-c19a-00000000008d ok: [managed_node1] 19110 1726882545.10562: no more pending results, returning what we have 19110 1726882545.10567: results queue empty 19110 1726882545.10569: checking for any_errors_fatal 19110 1726882545.10570: done checking for any_errors_fatal 19110 1726882545.10571: checking for max_fail_percentage 19110 1726882545.10573: done checking for max_fail_percentage 19110 1726882545.10574: checking to see if all hosts have failed and the running result is not ok 19110 1726882545.10576: done checking to see if all hosts have failed 19110 1726882545.10577: getting the remaining hosts for this loop 19110 1726882545.10578: done getting the remaining hosts for this loop 19110 1726882545.10582: getting the next task for host managed_node1 19110 1726882545.10592: done getting next task for host managed_node1 19110 1726882545.10594: ^ task is: TASK: Check if system is ostree 19110 1726882545.10597: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19110 1726882545.10601: getting variables 19110 1726882545.10603: in VariableManager get_vars() 19110 1726882545.10657: Calling all_inventory to load vars for managed_node1 19110 1726882545.10662: Calling groups_inventory to load vars for managed_node1 19110 1726882545.10672: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882545.10684: Calling all_plugins_play to load vars for managed_node1 19110 1726882545.10687: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882545.10690: Calling groups_plugins_play to load vars for managed_node1 19110 1726882545.10851: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882545.11195: done with get_vars() 19110 1726882545.11205: done getting variables 19110 1726882545.11328: done sending task result for task 0e448fcc-3ce9-5372-c19a-00000000008d 19110 1726882545.11331: WORKER PROCESS EXITING TASK [Check if system is ostree] *********************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17 Friday 20 September 2024 21:35:45 -0400 (0:00:00.633) 0:00:01.971 ****** 19110 1726882545.11405: entering _queue_task() for managed_node1/stat 19110 1726882545.11731: worker is 1 (out of 1 available) 19110 1726882545.11744: exiting _queue_task() for managed_node1/stat 19110 1726882545.11755: done queuing things up, now waiting for results queue to drain 19110 1726882545.11763: waiting for pending results... 
19110 1726882545.12007: running TaskExecutor() for managed_node1/TASK: Check if system is ostree 19110 1726882545.12112: in run() - task 0e448fcc-3ce9-5372-c19a-00000000008f 19110 1726882545.12127: variable 'ansible_search_path' from source: unknown 19110 1726882545.12133: variable 'ansible_search_path' from source: unknown 19110 1726882545.12169: calling self._execute() 19110 1726882545.12242: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882545.12252: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882545.12262: variable 'omit' from source: magic vars 19110 1726882545.12717: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 19110 1726882545.12994: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 19110 1726882545.13041: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 19110 1726882545.13090: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 19110 1726882545.13128: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 19110 1726882545.13223: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 19110 1726882545.13252: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 19110 1726882545.13296: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 19110 1726882545.13328: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 19110 1726882545.13456: Evaluated conditional (not __network_is_ostree is defined): True 19110 1726882545.13470: variable 'omit' from source: magic vars 19110 1726882545.13519: variable 'omit' from source: magic vars 19110 1726882545.13558: variable 'omit' from source: magic vars 19110 1726882545.13590: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19110 1726882545.13633: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19110 1726882545.13655: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19110 1726882545.13680: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19110 1726882545.13695: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19110 1726882545.13736: variable 'inventory_hostname' from source: host vars for 'managed_node1' 19110 1726882545.13745: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882545.13754: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882545.13871: Set connection var ansible_timeout to 10 19110 1726882545.13890: Set connection var ansible_module_compression to ZIP_DEFLATED 19110 1726882545.13900: Set connection var ansible_shell_executable to /bin/sh 19110 1726882545.13907: Set connection var ansible_shell_type to sh 19110 1726882545.13913: Set connection var ansible_connection to ssh 19110 1726882545.13923: Set connection var ansible_pipelining to False 19110 1726882545.13959: variable 'ansible_shell_executable' from source: unknown 19110 1726882545.13969: variable 'ansible_connection' from 
source: unknown 19110 1726882545.13977: variable 'ansible_module_compression' from source: unknown 19110 1726882545.13984: variable 'ansible_shell_type' from source: unknown 19110 1726882545.13990: variable 'ansible_shell_executable' from source: unknown 19110 1726882545.13998: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882545.14006: variable 'ansible_pipelining' from source: unknown 19110 1726882545.14013: variable 'ansible_timeout' from source: unknown 19110 1726882545.14021: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882545.14177: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 19110 1726882545.14192: variable 'omit' from source: magic vars 19110 1726882545.14201: starting attempt loop 19110 1726882545.14207: running the handler 19110 1726882545.14223: _low_level_execute_command(): starting 19110 1726882545.14236: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 19110 1726882545.15001: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882545.15021: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882545.15042: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882545.15061: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882545.15106: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882545.15118: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882545.15138: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 
1726882545.15162: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882545.15179: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882545.15191: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882545.15204: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882545.15219: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882545.15236: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882545.15259: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882545.15278: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882545.15294: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882545.15385: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882545.15408: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882545.15426: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882545.15549: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882545.17190: stdout chunk (state=3): >>>/root <<< 19110 1726882545.17295: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882545.17377: stderr chunk (state=3): >>><<< 19110 1726882545.17390: stdout chunk (state=3): >>><<< 19110 1726882545.17473: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 1726882545.17482: _low_level_execute_command(): starting 19110 1726882545.17486: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882545.1741784-19191-115753587283397 `" && echo ansible-tmp-1726882545.1741784-19191-115753587283397="` echo /root/.ansible/tmp/ansible-tmp-1726882545.1741784-19191-115753587283397 `" ) && sleep 0' 19110 1726882545.18080: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882545.18089: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882545.18103: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882545.18126: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882545.18174: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882545.18187: stderr chunk (state=3): >>>debug2: match not found <<< 19110 
1726882545.18200: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882545.18216: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882545.18227: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882545.18241: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882545.18253: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882545.18272: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882545.18289: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882545.18300: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882545.18310: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882545.18322: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882545.18405: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882545.18425: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882545.18440: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882545.18570: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882545.20451: stdout chunk (state=3): >>>ansible-tmp-1726882545.1741784-19191-115753587283397=/root/.ansible/tmp/ansible-tmp-1726882545.1741784-19191-115753587283397 <<< 19110 1726882545.20580: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882545.20646: stderr chunk (state=3): >>><<< 19110 1726882545.20649: stdout chunk (state=3): >>><<< 19110 1726882545.20969: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726882545.1741784-19191-115753587283397=/root/.ansible/tmp/ansible-tmp-1726882545.1741784-19191-115753587283397 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 1726882545.20973: variable 'ansible_module_compression' from source: unknown 19110 1726882545.20975: ANSIBALLZ: Using lock for stat 19110 1726882545.20977: ANSIBALLZ: Acquiring lock 19110 1726882545.20979: ANSIBALLZ: Lock acquired: 139855634067968 19110 1726882545.20981: ANSIBALLZ: Creating module 19110 1726882545.39943: ANSIBALLZ: Writing module into payload 19110 1726882545.40099: ANSIBALLZ: Writing module 19110 1726882545.40129: ANSIBALLZ: Renaming module 19110 1726882545.40141: ANSIBALLZ: Done creating module 19110 1726882545.40166: variable 'ansible_facts' from source: unknown 19110 1726882545.40265: transferring module to remote 
/root/.ansible/tmp/ansible-tmp-1726882545.1741784-19191-115753587283397/AnsiballZ_stat.py 19110 1726882545.40434: Sending initial data 19110 1726882545.40438: Sent initial data (153 bytes) 19110 1726882545.41593: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882545.41607: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882545.41621: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882545.41646: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882545.41694: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882545.41707: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882545.41721: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882545.41741: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882545.41759: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882545.41774: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882545.41786: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882545.41798: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882545.41813: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882545.41824: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882545.41834: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882545.41850: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882545.41983: 
stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882545.42706: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882545.42726: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882545.42926: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882545.44781: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 19110 1726882545.44878: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 19110 1726882545.44984: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-191108pnkimox/tmpt6w8giy6 /root/.ansible/tmp/ansible-tmp-1726882545.1741784-19191-115753587283397/AnsiballZ_stat.py <<< 19110 1726882545.45079: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 19110 1726882545.46756: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882545.46760: stderr chunk (state=3): >>><<< 19110 1726882545.46763: stdout chunk (state=3): >>><<< 19110 1726882545.46767: done transferring module to remote 19110 1726882545.46769: _low_level_execute_command(): starting 19110 1726882545.46771: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x 
/root/.ansible/tmp/ansible-tmp-1726882545.1741784-19191-115753587283397/ /root/.ansible/tmp/ansible-tmp-1726882545.1741784-19191-115753587283397/AnsiballZ_stat.py && sleep 0' 19110 1726882545.48275: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882545.48311: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882545.48344: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882545.48369: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882545.48438: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882545.48479: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882545.48494: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882545.48517: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882545.48636: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882545.48648: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882545.48668: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882545.48685: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882545.48701: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882545.48713: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882545.48724: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882545.48743: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882545.48821: stderr chunk 
(state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882545.48844: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882545.48863: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882545.49077: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882545.50970: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882545.50973: stdout chunk (state=3): >>><<< 19110 1726882545.50991: stderr chunk (state=3): >>><<< 19110 1726882545.51086: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 1726882545.51089: _low_level_execute_command(): starting 19110 1726882545.51091: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.9 
/root/.ansible/tmp/ansible-tmp-1726882545.1741784-19191-115753587283397/AnsiballZ_stat.py && sleep 0' 19110 1726882545.52488: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882545.52492: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882545.52529: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found <<< 19110 1726882545.52533: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882545.52535: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882545.52738: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882545.52886: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882545.52889: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882545.52997: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882545.55059: stdout chunk (state=3): >>>import _frozen_importlib # frozen import _imp # builtin <<< 19110 1726882545.55067: stdout chunk (state=3): >>>import '_thread' # import '_warnings' # import '_weakref' # <<< 19110 1726882545.55133: stdout chunk (state=3): >>>import '_io' # import 'marshal' # 
<<< 19110 1726882545.55182: stdout chunk (state=3): >>>import 'posix' # <<< 19110 1726882545.55210: stdout chunk (state=3): >>>import '_frozen_importlib_external' # <<< 19110 1726882545.55213: stdout chunk (state=3): >>># installing zipimport hook <<< 19110 1726882545.55248: stdout chunk (state=3): >>>import 'time' # <<< 19110 1726882545.55251: stdout chunk (state=3): >>>import 'zipimport' # # installed zipimport hook <<< 19110 1726882545.55309: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' <<< 19110 1726882545.55327: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py <<< 19110 1726882545.55351: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' <<< 19110 1726882545.55361: stdout chunk (state=3): >>>import '_codecs' # <<< 19110 1726882545.55375: stdout chunk (state=3): >>>import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331d317dc0> <<< 19110 1726882545.55432: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py <<< 19110 1726882545.55438: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' <<< 19110 1726882545.55441: stdout chunk (state=3): >>>import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331d2bc3a0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331d317b20> <<< 19110 1726882545.55471: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py # code object from 
'/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' <<< 19110 1726882545.55475: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331d317ac0> <<< 19110 1726882545.55490: stdout chunk (state=3): >>>import '_signal' # <<< 19110 1726882545.55516: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' <<< 19110 1726882545.55535: stdout chunk (state=3): >>>import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331d2bc490> <<< 19110 1726882545.55557: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py # code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' <<< 19110 1726882545.55589: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py # code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' <<< 19110 1726882545.55592: stdout chunk (state=3): >>>import '_abc' # <<< 19110 1726882545.55612: stdout chunk (state=3): >>>import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331d2bc940> <<< 19110 1726882545.55615: stdout chunk (state=3): >>>import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331d2bc670> <<< 19110 1726882545.55658: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py <<< 19110 1726882545.55666: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' <<< 19110 1726882545.55684: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py <<< 19110 1726882545.55699: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' <<< 19110 1726882545.55717: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py <<< 19110 1726882545.55734: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' <<< 19110 1726882545.55767: stdout chunk (state=3): >>>import '_stat' # <<< 19110 1726882545.55771: stdout chunk (state=3): >>>import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331d273190> <<< 19110 1726882545.55793: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py <<< 19110 1726882545.55796: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' <<< 19110 1726882545.55883: stdout chunk (state=3): >>>import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331d273220> <<< 19110 1726882545.55902: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py # code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' <<< 19110 1726882545.55929: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py <<< 19110 1726882545.55933: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331d296850> import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331d273940> <<< 19110 1726882545.55969: stdout chunk (state=3): >>>import 'os' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331d2d4880> <<< 19110 1726882545.55991: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches 
/usr/lib64/python3.9/_sitebuiltins.py # code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331d26cd90> <<< 19110 1726882545.56053: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py # code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' <<< 19110 1726882545.56058: stdout chunk (state=3): >>>import '_locale' # import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331d296d90> <<< 19110 1726882545.56107: stdout chunk (state=3): >>>import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331d2bc970> <<< 19110 1726882545.56135: stdout chunk (state=3): >>>Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. <<< 19110 1726882545.56332: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py # code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' <<< 19110 1726882545.56373: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py # code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' <<< 19110 1726882545.56392: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py <<< 19110 1726882545.56407: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' <<< 19110 1726882545.56441: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py <<< 19110 1726882545.56444: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' <<< 19110 
1726882545.56459: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331cfd3eb0> <<< 19110 1726882545.56496: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331cfd6f40> <<< 19110 1726882545.56524: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py <<< 19110 1726882545.56528: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' <<< 19110 1726882545.56547: stdout chunk (state=3): >>>import '_sre' # <<< 19110 1726882545.56578: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py <<< 19110 1726882545.56584: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' <<< 19110 1726882545.56606: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py # code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' <<< 19110 1726882545.56625: stdout chunk (state=3): >>>import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331cfcc610> <<< 19110 1726882545.56638: stdout chunk (state=3): >>>import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331cfd2640> import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331cfd3370> <<< 19110 1726882545.56670: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py <<< 19110 1726882545.56739: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' <<< 19110 1726882545.56768: stdout chunk (state=3): >>># /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches 
/usr/lib64/python3.9/collections/__init__.py <<< 19110 1726882545.56791: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' <<< 19110 1726882545.56810: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py # code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' <<< 19110 1726882545.56865: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f331cf54e20> <<< 19110 1726882545.56869: stdout chunk (state=3): >>>import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331cf54910> <<< 19110 1726882545.56875: stdout chunk (state=3): >>>import 'itertools' # <<< 19110 1726882545.56902: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py # code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331cf54f10> <<< 19110 1726882545.56905: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches /usr/lib64/python3.9/operator.py <<< 19110 1726882545.56935: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' <<< 19110 1726882545.56953: stdout chunk (state=3): >>>import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331cf54fd0> <<< 19110 1726882545.56988: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py <<< 19110 1726882545.56991: stdout chunk (state=3): >>># code 
object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331cf670d0> <<< 19110 1726882545.57003: stdout chunk (state=3): >>>import '_collections' # <<< 19110 1726882545.57056: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331cfaed90> <<< 19110 1726882545.57060: stdout chunk (state=3): >>>import '_functools' # <<< 19110 1726882545.57077: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331cfa7670> <<< 19110 1726882545.57149: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py # code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' <<< 19110 1726882545.57152: stdout chunk (state=3): >>>import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331cfb96d0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331cfdae20> <<< 19110 1726882545.57176: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' <<< 19110 1726882545.57192: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f331cf67cd0> <<< 19110 1726882545.57203: stdout chunk (state=3): >>>import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331cfae2b0> <<< 19110 1726882545.57236: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' <<< 
19110 1726882545.57250: stdout chunk (state=3): >>># extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f331cfb92e0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331cfe09d0> <<< 19110 1726882545.57271: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py <<< 19110 1726882545.57287: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' <<< 19110 1726882545.57303: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' <<< 19110 1726882545.57318: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py <<< 19110 1726882545.57344: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' <<< 19110 1726882545.57358: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331cf67eb0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331cf67df0> <<< 19110 1726882545.57382: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches /usr/lib64/python3.9/importlib/machinery.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331cf67d60> <<< 19110 1726882545.57404: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py <<< 19110 1726882545.57419: stdout chunk 
(state=3): >>># code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' <<< 19110 1726882545.57437: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py <<< 19110 1726882545.57456: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' <<< 19110 1726882545.57470: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py <<< 19110 1726882545.57519: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' <<< 19110 1726882545.57543: stdout chunk (state=3): >>># /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py # code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' <<< 19110 1726882545.57570: stdout chunk (state=3): >>>import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331cf3a3d0> # /usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches /usr/lib64/python3.9/contextlib.py <<< 19110 1726882545.57582: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' <<< 19110 1726882545.57671: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331cf3a4c0> <<< 19110 1726882545.57730: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331cf6ef40> <<< 19110 1726882545.57768: stdout chunk (state=3): >>>import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331cf69a90> <<< 19110 1726882545.57785: stdout chunk (state=3): >>>import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331cf69490> <<< 19110 1726882545.57804: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py # code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' <<< 19110 1726882545.57837: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py <<< 19110 1726882545.57852: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' <<< 19110 1726882545.57874: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py <<< 19110 1726882545.57890: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331ce6e220> <<< 19110 1726882545.57915: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331cf25520> <<< 19110 1726882545.57969: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331cf69f10> <<< 19110 1726882545.57984: stdout chunk (state=3): >>>import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331cfe0040> <<< 19110 1726882545.58007: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py <<< 19110 1726882545.58028: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' <<< 19110 1726882545.58048: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py # code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331ce80b50> <<< 19110 1726882545.58066: stdout chunk (state=3): >>>import 'errno' # <<< 19110 1726882545.58086: stdout chunk 
(state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' <<< 19110 1726882545.58109: stdout chunk (state=3): >>># extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f331ce80e80> # /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py <<< 19110 1726882545.58130: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' <<< 19110 1726882545.58144: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py # code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331ce91790> <<< 19110 1726882545.58223: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py # code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' <<< 19110 1726882545.58235: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331ce91cd0> <<< 19110 1726882545.58335: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f331ce2a400> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331ce80f70> # /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py # code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' <<< 19110 1726882545.58355: stdout chunk (state=3): >>># 
extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f331ce3b2e0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331ce91610> <<< 19110 1726882545.58377: stdout chunk (state=3): >>>import 'pwd' # <<< 19110 1726882545.58390: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f331ce3b3a0> <<< 19110 1726882545.58438: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331cf67a30> <<< 19110 1726882545.58453: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py <<< 19110 1726882545.58474: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' <<< 19110 1726882545.58490: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py <<< 19110 1726882545.58505: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' <<< 19110 1726882545.58552: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f331ce56700> <<< 19110 1726882545.58576: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' <<< 19110 1726882545.58594: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f331ce569d0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331ce567c0> <<< 19110 1726882545.58613: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' <<< 19110 1726882545.58628: stdout chunk (state=3): >>># extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f331ce568b0> <<< 19110 1726882545.58652: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc matches /usr/lib64/python3.9/hashlib.py # code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' <<< 19110 1726882545.58851: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' <<< 19110 1726882545.58879: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f331ce56d00> <<< 19110 1726882545.58899: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' # extension module '_blake2' executed from 
'/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f331ce61250> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331ce56940> <<< 19110 1726882545.58914: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331ce4aa90> <<< 19110 1726882545.58925: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331cf67610> <<< 19110 1726882545.58948: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py <<< 19110 1726882545.59011: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' <<< 19110 1726882545.59050: stdout chunk (state=3): >>>import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331ce56af0> <<< 19110 1726882545.59153: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/cp437.pyc' <<< 19110 1726882545.59171: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f331cd7a6d0> <<< 19110 1726882545.59340: stdout chunk (state=3): >>># zipimport: found 30 names in '/tmp/ansible_stat_payload_72t4nluk/ansible_stat_payload.zip' <<< 19110 1726882545.59343: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882545.59430: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882545.59481: stdout chunk (state=3): >>>import ansible # loaded from Zip /tmp/ansible_stat_payload_72t4nluk/ansible_stat_payload.zip/ansible/__init__.py <<< 19110 1726882545.59484: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882545.59487: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882545.59514: stdout chunk (state=3): >>>import ansible.module_utils # loaded from Zip 
/tmp/ansible_stat_payload_72t4nluk/ansible_stat_payload.zip/ansible/module_utils/__init__.py <<< 19110 1726882545.59516: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882545.60791: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882545.61775: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py # code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331c724820> <<< 19110 1726882545.61801: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py # code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' <<< 19110 1726882545.61827: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' <<< 19110 1726882545.61852: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py # code object from '/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' <<< 19110 1726882545.61893: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' <<< 19110 1726882545.61896: stdout chunk (state=3): >>># extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f331c7b4730> <<< 19110 1726882545.61933: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331c7b4610> <<< 19110 1726882545.61973: stdout chunk (state=3): >>>import 'json.decoder' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f331c7b4340> <<< 19110 1726882545.61988: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' <<< 19110 1726882545.62051: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331c7b4460> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331c7b4160> <<< 19110 1726882545.62076: stdout chunk (state=3): >>>import 'atexit' # <<< 19110 1726882545.62090: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f331c7b43a0> # /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py <<< 19110 1726882545.62127: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' <<< 19110 1726882545.62162: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331c7b4790> <<< 19110 1726882545.62198: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py <<< 19110 1726882545.62201: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' <<< 19110 1726882545.62217: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py <<< 19110 1726882545.62238: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' <<< 19110 1726882545.62272: stdout chunk 
(state=3): >>># /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py <<< 19110 1726882545.62275: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' <<< 19110 1726882545.62344: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331c6a47f0> <<< 19110 1726882545.62385: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f331c6a4b80> <<< 19110 1726882545.62412: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' <<< 19110 1726882545.62415: stdout chunk (state=3): >>># extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f331c6a49d0> <<< 19110 1726882545.62429: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py <<< 19110 1726882545.62458: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' <<< 19110 1726882545.62492: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331c6c3af0> <<< 19110 1726882545.62505: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331c7add60> <<< 19110 1726882545.62675: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331c7b44f0> <<< 19110 1726882545.62693: 
stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py # code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' <<< 19110 1726882545.62746: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331c7ad1c0> <<< 19110 1726882545.62749: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py <<< 19110 1726882545.62751: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' <<< 19110 1726882545.62800: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py # code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' <<< 19110 1726882545.62806: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches /usr/lib64/python3.9/tokenize.py <<< 19110 1726882545.62810: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' <<< 19110 1726882545.62830: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py # code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331c720b20> <<< 19110 1726882545.62920: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331c755eb0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331c7558b0> <<< 19110 1726882545.62925: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331c6bed30> <<< 19110 1726882545.62957: stdout chunk (state=3): >>># extension module 'syslog' loaded from 
'/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f331c7559a0> <<< 19110 1726882545.62988: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331c784d00> <<< 19110 1726882545.63015: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py <<< 19110 1726882545.63020: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' <<< 19110 1726882545.63045: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py <<< 19110 1726882545.63073: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' <<< 19110 1726882545.63146: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f331c685a00> <<< 19110 1726882545.63150: stdout chunk (state=3): >>>import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331c78ce80> <<< 19110 1726882545.63172: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py <<< 
19110 1726882545.63184: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' <<< 19110 1726882545.63239: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f331c6940a0> <<< 19110 1726882545.63244: stdout chunk (state=3): >>>import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331c78ceb0> <<< 19110 1726882545.63270: stdout chunk (state=3): >>># /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py <<< 19110 1726882545.63329: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' <<< 19110 1726882545.63333: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py <<< 19110 1726882545.63336: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' import '_string' # <<< 19110 1726882545.63400: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331c759730> <<< 19110 1726882545.63526: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331c6940d0> <<< 19110 1726882545.63622: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f331c691550> <<< 19110 1726882545.63656: 
stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f331c691610> <<< 19110 1726882545.63698: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f331c690c40> <<< 19110 1726882545.63705: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331c784ee0> <<< 19110 1726882545.63727: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' <<< 19110 1726882545.63739: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py <<< 19110 1726882545.63760: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' <<< 19110 1726882545.63805: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f331c714b50> <<< 19110 1726882545.64040: stdout chunk (state=3): >>># extension module 'array' 
loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f331c713940> <<< 19110 1726882545.64044: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331c687820> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f331c7145b0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331c74daf0> <<< 19110 1726882545.64068: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882545.64087: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.compat # loaded from Zip /tmp/ansible_stat_payload_72t4nluk/ansible_stat_payload.zip/ansible/module_utils/compat/__init__.py # zipimport: zlib available <<< 19110 1726882545.64290: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common # loaded from Zip /tmp/ansible_stat_payload_72t4nluk/ansible_stat_payload.zip/ansible/module_utils/common/__init__.py # zipimport: zlib available <<< 19110 1726882545.64293: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_stat_payload_72t4nluk/ansible_stat_payload.zip/ansible/module_utils/common/text/__init__.py <<< 19110 1726882545.64300: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882545.64404: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 
1726882545.64498: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882545.64971: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882545.65432: stdout chunk (state=3): >>>import ansible.module_utils.six # loaded from Zip /tmp/ansible_stat_payload_72t4nluk/ansible_stat_payload.zip/ansible/module_utils/six/__init__.py import 'ansible.module_utils.six.moves' # <<< 19110 1726882545.65439: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves.collections_abc' # import ansible.module_utils.common.text.converters # loaded from Zip /tmp/ansible_stat_payload_72t4nluk/ansible_stat_payload.zip/ansible/module_utils/common/text/converters.py <<< 19110 1726882545.65460: stdout chunk (state=3): >>># /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py <<< 19110 1726882545.65533: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f331c28fdf0> <<< 19110 1726882545.65622: stdout chunk (state=3): >>># /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331c6615b0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331c652df0> <<< 19110 1726882545.65678: stdout chunk (state=3): >>>import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_stat_payload_72t4nluk/ansible_stat_payload.zip/ansible/module_utils/compat/selinux.py <<< 19110 
1726882545.65692: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882545.65716: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils._text # loaded from Zip /tmp/ansible_stat_payload_72t4nluk/ansible_stat_payload.zip/ansible/module_utils/_text.py # zipimport: zlib available <<< 19110 1726882545.65860: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882545.65973: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py # code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' <<< 19110 1726882545.65996: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331c70a9d0> <<< 19110 1726882545.66004: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882545.66392: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882545.66758: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882545.66810: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882545.66884: stdout chunk (state=3): >>>import ansible.module_utils.common.collections # loaded from Zip /tmp/ansible_stat_payload_72t4nluk/ansible_stat_payload.zip/ansible/module_utils/common/collections.py # zipimport: zlib available <<< 19110 1726882545.66916: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882545.66946: stdout chunk (state=3): >>>import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_stat_payload_72t4nluk/ansible_stat_payload.zip/ansible/module_utils/common/warnings.py <<< 19110 1726882545.66956: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882545.67009: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882545.67099: stdout chunk (state=3): >>>import ansible.module_utils.errors # loaded from Zip 
/tmp/ansible_stat_payload_72t4nluk/ansible_stat_payload.zip/ansible/module_utils/errors.py # zipimport: zlib available <<< 19110 1726882545.67114: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_stat_payload_72t4nluk/ansible_stat_payload.zip/ansible/module_utils/parsing/__init__.py <<< 19110 1726882545.67122: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882545.67156: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882545.67192: stdout chunk (state=3): >>>import ansible.module_utils.parsing.convert_bool # loaded from Zip /tmp/ansible_stat_payload_72t4nluk/ansible_stat_payload.zip/ansible/module_utils/parsing/convert_bool.py <<< 19110 1726882545.67197: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882545.67381: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882545.67568: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py <<< 19110 1726882545.67602: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' import '_ast' # <<< 19110 1726882545.67684: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331c262e50> # zipimport: zlib available <<< 19110 1726882545.67745: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882545.67824: stdout chunk (state=3): >>>import ansible.module_utils.common.text.formatters # loaded from Zip /tmp/ansible_stat_payload_72t4nluk/ansible_stat_payload.zip/ansible/module_utils/common/text/formatters.py import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_stat_payload_72t4nluk/ansible_stat_payload.zip/ansible/module_utils/common/validation.py import ansible.module_utils.common.parameters # loaded from Zip 
/tmp/ansible_stat_payload_72t4nluk/ansible_stat_payload.zip/ansible/module_utils/common/parameters.py import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_stat_payload_72t4nluk/ansible_stat_payload.zip/ansible/module_utils/common/arg_spec.py <<< 19110 1726882545.67861: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882545.67882: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882545.68033: stdout chunk (state=3): >>>import ansible.module_utils.common.locale # loaded from Zip /tmp/ansible_stat_payload_72t4nluk/ansible_stat_payload.zip/ansible/module_utils/common/locale.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 19110 1726882545.68099: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882545.68155: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py <<< 19110 1726882545.68169: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' <<< 19110 1726882545.68257: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f331c79e910> <<< 19110 1726882545.68280: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331c262be0> <<< 19110 1726882545.68310: stdout chunk (state=3): >>>import ansible.module_utils.common.file # loaded from Zip /tmp/ansible_stat_payload_72t4nluk/ansible_stat_payload.zip/ansible/module_utils/common/file.py import ansible.module_utils.common.process # loaded 
from Zip /tmp/ansible_stat_payload_72t4nluk/ansible_stat_payload.zip/ansible/module_utils/common/process.py <<< 19110 1726882545.68323: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882545.68472: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882545.68511: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882545.68538: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882545.68590: stdout chunk (state=3): >>># /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py # code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' <<< 19110 1726882545.68603: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py <<< 19110 1726882545.68640: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' <<< 19110 1726882545.68661: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py <<< 19110 1726882545.68676: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' <<< 19110 1726882545.68770: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331c223c70> <<< 19110 1726882545.68808: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331c654670> <<< 19110 1726882545.68868: stdout chunk (state=3): >>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331c653850> # destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip /tmp/ansible_stat_payload_72t4nluk/ansible_stat_payload.zip/ansible/module_utils/distro/__init__.py <<< 19110 1726882545.68874: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 
1726882545.68904: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882545.68924: stdout chunk (state=3): >>>import ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_stat_payload_72t4nluk/ansible_stat_payload.zip/ansible/module_utils/common/_utils.py import ansible.module_utils.common.sys_info # loaded from Zip /tmp/ansible_stat_payload_72t4nluk/ansible_stat_payload.zip/ansible/module_utils/common/sys_info.py <<< 19110 1726882545.68993: stdout chunk (state=3): >>>import ansible.module_utils.basic # loaded from Zip /tmp/ansible_stat_payload_72t4nluk/ansible_stat_payload.zip/ansible/module_utils/basic.py <<< 19110 1726882545.69027: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 19110 1726882545.69035: stdout chunk (state=3): >>>import ansible.modules # loaded from Zip /tmp/ansible_stat_payload_72t4nluk/ansible_stat_payload.zip/ansible/modules/__init__.py # zipimport: zlib available <<< 19110 1726882545.69149: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882545.69319: stdout chunk (state=3): >>># zipimport: zlib available <<< 19110 1726882545.69460: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ <<< 19110 1726882545.69737: stdout chunk (state=3): >>># clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks <<< 19110 1726882545.69744: stdout chunk (state=3): >>># clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # 
cleanup[2] removing _thread # cleanup[2] removing _warnings <<< 19110 1726882545.69770: stdout chunk (state=3): >>># cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins <<< 19110 1726882545.69803: stdout chunk (state=3): >>># cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # 
cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc<<< 19110 1726882545.69849: stdout chunk (state=3): >>> # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] 
removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime <<< 19110 1726882545.69859: stdout chunk (state=3): >>># cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # 
destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules <<< 19110 1726882545.70045: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 19110 1726882545.70051: stdout chunk (state=3): >>># destroy importlib.util # destroy importlib.abc # destroy importlib.machinery <<< 19110 1726882545.70081: stdout chunk (state=3): >>># destroy zipimport # destroy _compression <<< 19110 1726882545.70110: stdout chunk 
(state=3): >>># destroy binascii # destroy importlib # destroy struct # destroy bz2 # destroy lzma <<<
19110 1726882545.70132: stdout chunk (state=3): >>># destroy __main__ # destroy locale # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings <<<
19110 1726882545.70160: stdout chunk (state=3): >>># destroy syslog # destroy uuid # destroy array # destroy datetime <<<
19110 1726882545.70169: stdout chunk (state=3): >>># destroy selinux # destroy distro # destroy json # destroy shlex # destroy logging # destroy argparse <<<
19110 1726882545.70230: stdout chunk (state=3): >>># cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime <<<
19110 1726882545.70282: stdout chunk (state=3): >>># cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch <<<
19110 1726882545.70312: stdout chunk (state=3): >>># cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap <<<
19110 1726882545.70340: stdout chunk (state=3): >>># cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools <<<
19110 1726882545.70371: stdout chunk (state=3): >>># cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os <<<
19110 1726882545.70405: stdout chunk (state=3): >>># cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix <<<
19110 1726882545.70427: stdout chunk (state=3): >>># cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy systemd._daemon # destroy _socket <<<
19110 1726882545.70434: stdout chunk (state=3): >>># destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal <<<
19110 1726882545.70596: stdout chunk (state=3): >>># destroy platform # destroy _uuid # destroy _sre # destroy sre_parse <<<
19110 1726882545.70622: stdout chunk (state=3): >>># destroy tokenize # destroy _heapq # destroy posixpath # destroy stat <<<
19110 1726882545.70637: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors <<<
19110 1726882545.70657: stdout chunk (state=3): >>># destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal <<<
19110 1726882545.70687: stdout chunk (state=3): >>># destroy _frozen_importlib # clear sys.audit hooks <<<
19110 1726882545.71071: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed.
<<< 19110 1726882545.71074: stdout chunk (state=3): >>><<< 19110 1726882545.71083: stderr chunk (state=3): >>><<< 19110 1726882545.71227: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py # code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' import '_codecs' # import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331d317dc0> # /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331d2bc3a0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331d317b20> # /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331d317ac0> import '_signal' # # /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331d2bc490> # 
/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py # code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py # code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' import '_abc' # import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331d2bc940> import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331d2bc670> # /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py # code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py # code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py # code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' import '_stat' # import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331d273190> # /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py # code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331d273220> # /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py # code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py # code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331d296850> import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331d273940> import 'os' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f331d2d4880> # /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py # code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331d26cd90> # /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py # code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' import '_locale' # import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331d296d90> import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331d2bc970> Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py # code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py # code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py # code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py # code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331cfd3eb0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331cfd6f40> # /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py # code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' import '_sre' # # 
/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py # code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py # code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331cfcc610> import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331cfd2640> import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331cfd3370> # /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py # code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/collections/__init__.py # code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py # code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f331cf54e20> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331cf54910> import 'itertools' # # /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py # code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331cf54f10> # /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches 
/usr/lib64/python3.9/operator.py # code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331cf54fd0> # /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py # code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331cf670d0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331cfaed90> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331cfa7670> # /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py # code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331cfb96d0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331cfdae20> # /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f331cf67cd0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331cfae2b0> # extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f331cfb92e0> import 
'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331cfe09d0> # /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py # code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py # code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331cf67eb0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331cf67df0> # /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches /usr/lib64/python3.9/importlib/machinery.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331cf67d60> # /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py # code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py # code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331cf3a3d0> # 
/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches /usr/lib64/python3.9/contextlib.py # code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331cf3a4c0> import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331cf6ef40> import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331cf69a90> import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331cf69490> # /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py # code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py # code object from '/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py # code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331ce6e220> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331cf25520> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331cf69f10> import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331cfe0040> # /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py # code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py # code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331ce80b50> import 'errno' # # extension module 'zlib' loaded from 
'/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f331ce80e80> # /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py # code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py # code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331ce91790> # /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py # code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331ce91cd0> # extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f331ce2a400> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331ce80f70> # /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py # code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f331ce3b2e0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331ce91610> import 'pwd' # # extension module 'grp' 
loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f331ce3b3a0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331cf67a30> # /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py # code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py # code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' # extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f331ce56700> # /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f331ce569d0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331ce567c0> # extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f331ce568b0> # /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc 
matches /usr/lib64/python3.9/hashlib.py # code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f331ce56d00> # extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f331ce61250> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331ce56940> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331ce4aa90> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331cf67610> # /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py # code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331ce56af0> # code object from '/usr/lib64/python3.9/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f331cd7a6d0> # zipimport: found 30 names in '/tmp/ansible_stat_payload_72t4nluk/ansible_stat_payload.zip' # zipimport: zlib available # zipimport: zlib available import ansible # loaded from Zip /tmp/ansible_stat_payload_72t4nluk/ansible_stat_payload.zip/ansible/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils # loaded from Zip /tmp/ansible_stat_payload_72t4nluk/ansible_stat_payload.zip/ansible/module_utils/__init__.py # zipimport: zlib available # zipimport: zlib available # 
/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py # code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331c724820> # /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py # code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py # code object from '/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' # extension module '_json' loaded from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f331c7b4730> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331c7b4610> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331c7b4340> # /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331c7b4460> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331c7b4160> import 'atexit' # # extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f331c7b43a0> # /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py # code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331c7b4790> # /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py # code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py # code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py # code object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331c6a47f0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f331c6a4b80> # extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f331c6a49d0> # /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py # code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331c6c3af0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f331c7add60> import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331c7b44f0> # /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py # code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331c7ad1c0> # /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py # code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py # code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches /usr/lib64/python3.9/tokenize.py # code object from '/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py # code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331c720b20> import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331c755eb0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331c7558b0> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331c6bed30> # extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f331c7559a0> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' 
import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331c784d00> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py # code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f331c685a00> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331c78ce80> # /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py # code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f331c6940a0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331c78ceb0> # /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py # code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py # code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331c759730> import 
'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331c6940d0> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f331c691550> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f331c691610> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f331c690c40> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331c784ee0> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py # code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object 
at 0x7f331c714b50> # extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f331c713940> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331c687820> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f331c7145b0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331c74daf0> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat # loaded from Zip /tmp/ansible_stat_payload_72t4nluk/ansible_stat_payload.zip/ansible/module_utils/compat/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common # loaded from Zip /tmp/ansible_stat_payload_72t4nluk/ansible_stat_payload.zip/ansible/module_utils/common/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_stat_payload_72t4nluk/ansible_stat_payload.zip/ansible/module_utils/common/text/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.six # loaded from Zip /tmp/ansible_stat_payload_72t4nluk/ansible_stat_payload.zip/ansible/module_utils/six/__init__.py import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import ansible.module_utils.common.text.converters # 
loaded from Zip /tmp/ansible_stat_payload_72t4nluk/ansible_stat_payload.zip/ansible/module_utils/common/text/converters.py # /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f331c28fdf0> # /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331c6615b0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331c652df0> import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_stat_payload_72t4nluk/ansible_stat_payload.zip/ansible/module_utils/compat/selinux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils._text # loaded from Zip /tmp/ansible_stat_payload_72t4nluk/ansible_stat_payload.zip/ansible/module_utils/_text.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py # code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331c70a9d0> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.collections # loaded from Zip /tmp/ansible_stat_payload_72t4nluk/ansible_stat_payload.zip/ansible/module_utils/common/collections.py # zipimport: zlib 
available # zipimport: zlib available import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_stat_payload_72t4nluk/ansible_stat_payload.zip/ansible/module_utils/common/warnings.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.errors # loaded from Zip /tmp/ansible_stat_payload_72t4nluk/ansible_stat_payload.zip/ansible/module_utils/errors.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_stat_payload_72t4nluk/ansible_stat_payload.zip/ansible/module_utils/parsing/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing.convert_bool # loaded from Zip /tmp/ansible_stat_payload_72t4nluk/ansible_stat_payload.zip/ansible/module_utils/parsing/convert_bool.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py # code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331c262e50> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text.formatters # loaded from Zip /tmp/ansible_stat_payload_72t4nluk/ansible_stat_payload.zip/ansible/module_utils/common/text/formatters.py import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_stat_payload_72t4nluk/ansible_stat_payload.zip/ansible/module_utils/common/validation.py import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_stat_payload_72t4nluk/ansible_stat_payload.zip/ansible/module_utils/common/parameters.py import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_stat_payload_72t4nluk/ansible_stat_payload.zip/ansible/module_utils/common/arg_spec.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.locale # loaded from 
Zip /tmp/ansible_stat_payload_72t4nluk/ansible_stat_payload.zip/ansible/module_utils/common/locale.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f331c79e910> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331c262be0> import ansible.module_utils.common.file # loaded from Zip /tmp/ansible_stat_payload_72t4nluk/ansible_stat_payload.zip/ansible/module_utils/common/file.py import ansible.module_utils.common.process # loaded from Zip /tmp/ansible_stat_payload_72t4nluk/ansible_stat_payload.zip/ansible/module_utils/common/process.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py # code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py # code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py # code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331c223c70> 
import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331c654670> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f331c653850> # destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip /tmp/ansible_stat_payload_72t4nluk/ansible_stat_payload.zip/ansible/module_utils/distro/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_stat_payload_72t4nluk/ansible_stat_payload.zip/ansible/module_utils/common/_utils.py import ansible.module_utils.common.sys_info # loaded from Zip /tmp/ansible_stat_payload_72t4nluk/ansible_stat_payload.zip/ansible/module_utils/common/sys_info.py import ansible.module_utils.basic # loaded from Zip /tmp/ansible_stat_payload_72t4nluk/ansible_stat_payload.zip/ansible/module_utils/basic.py # zipimport: zlib available # zipimport: zlib available import ansible.modules # loaded from Zip /tmp/ansible_stat_payload_72t4nluk/ansible_stat_payload.zip/ansible/modules/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # 
cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # 
cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # 
destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] 
removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.util # destroy importlib.abc # destroy importlib.machinery # destroy zipimport # destroy _compression # destroy binascii # destroy importlib # destroy struct # destroy bz2 # destroy lzma # destroy __main__ # destroy locale # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog # destroy uuid # destroy array # destroy datetime # destroy selinux # destroy distro # destroy json # destroy shlex # destroy logging # destroy argparse # cleanup[3] wiping selinux._selinux # cleanup[3] 
wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping 
posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal # destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize # destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal # destroy _frozen_importlib # clear sys.audit hooks , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. [WARNING]: Module invocation had junk after the JSON data: # destroy __main__ # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing 
_signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing 
_compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing 
systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # 
cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.util # destroy importlib.abc # destroy importlib.machinery # destroy zipimport # destroy _compression # destroy binascii # destroy importlib # destroy struct # destroy bz2 # destroy lzma # destroy __main__ # destroy locale # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog # destroy uuid # destroy array # destroy datetime # destroy selinux # destroy distro # destroy json # destroy shlex # destroy logging # destroy argparse # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] 
wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] 
wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal # destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize # destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal # destroy _frozen_importlib # clear sys.audit hooks 19110 1726882545.71775: done with _execute_module (stat, {'path': '/run/ostree-booted', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', 
'_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882545.1741784-19191-115753587283397/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 19110 1726882545.71779: _low_level_execute_command(): starting 19110 1726882545.71781: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882545.1741784-19191-115753587283397/ > /dev/null 2>&1 && sleep 0' 19110 1726882545.71998: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882545.72007: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882545.72016: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882545.72029: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882545.72076: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882545.72083: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882545.72095: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882545.72105: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882545.72112: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882545.72118: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882545.72125: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882545.72133: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882545.72144: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882545.72159: stderr chunk (state=3): >>>debug2: 
checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882545.72165: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882545.72174: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882545.72245: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882545.72265: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882545.72282: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882545.72399: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882545.74339: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882545.74344: stdout chunk (state=3): >>><<< 19110 1726882545.74352: stderr chunk (state=3): >>><<< 19110 1726882545.74774: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master 
version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 1726882545.74779: handler run complete 19110 1726882545.74781: attempt loop complete, returning result 19110 1726882545.74783: _execute() done 19110 1726882545.74785: dumping result to json 19110 1726882545.74787: done dumping result, returning 19110 1726882545.74789: done running TaskExecutor() for managed_node1/TASK: Check if system is ostree [0e448fcc-3ce9-5372-c19a-00000000008f] 19110 1726882545.74791: sending task result for task 0e448fcc-3ce9-5372-c19a-00000000008f 19110 1726882545.74857: done sending task result for task 0e448fcc-3ce9-5372-c19a-00000000008f 19110 1726882545.74860: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "stat": { "exists": false } } 19110 1726882545.75223: no more pending results, returning what we have 19110 1726882545.75226: results queue empty 19110 1726882545.75227: checking for any_errors_fatal 19110 1726882545.75233: done checking for any_errors_fatal 19110 1726882545.75234: checking for max_fail_percentage 19110 1726882545.75236: done checking for max_fail_percentage 19110 1726882545.75237: checking to see if all hosts have failed and the running result is not ok 19110 1726882545.75237: done checking to see if all hosts have failed 19110 1726882545.75238: getting the remaining hosts for this loop 19110 1726882545.75239: done getting the remaining hosts for this loop 19110 1726882545.75243: getting the next task for host managed_node1 19110 1726882545.75247: done getting next task for host managed_node1 19110 1726882545.75250: ^ task is: TASK: Set flag to indicate system is ostree 19110 1726882545.75253: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19110 1726882545.75258: getting variables 19110 1726882545.75259: in VariableManager get_vars() 19110 1726882545.75288: Calling all_inventory to load vars for managed_node1 19110 1726882545.75290: Calling groups_inventory to load vars for managed_node1 19110 1726882545.75293: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882545.75302: Calling all_plugins_play to load vars for managed_node1 19110 1726882545.75305: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882545.75308: Calling groups_plugins_play to load vars for managed_node1 19110 1726882545.75489: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882545.75846: done with get_vars() 19110 1726882545.75857: done getting variables 19110 1726882545.76141: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Set flag to indicate system is ostree] *********************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:22 Friday 20 September 2024 21:35:45 -0400 (0:00:00.647) 0:00:02.619 ****** 19110 1726882545.76192: entering _queue_task() for managed_node1/set_fact 19110 1726882545.76194: Creating lock for set_fact 19110 1726882545.76489: worker is 1 (out of 1 available) 19110 
1726882545.76504: exiting _queue_task() for managed_node1/set_fact 19110 1726882545.76516: done queuing things up, now waiting for results queue to drain 19110 1726882545.76517: waiting for pending results... 19110 1726882545.76785: running TaskExecutor() for managed_node1/TASK: Set flag to indicate system is ostree 19110 1726882545.76877: in run() - task 0e448fcc-3ce9-5372-c19a-000000000090 19110 1726882545.76888: variable 'ansible_search_path' from source: unknown 19110 1726882545.76892: variable 'ansible_search_path' from source: unknown 19110 1726882545.76924: calling self._execute() 19110 1726882545.76997: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882545.77001: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882545.77011: variable 'omit' from source: magic vars 19110 1726882545.77579: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 19110 1726882545.77822: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 19110 1726882545.77875: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 19110 1726882545.77916: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 19110 1726882545.77957: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 19110 1726882545.78049: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 19110 1726882545.78081: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 19110 1726882545.78109: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 19110 1726882545.78143: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 19110 1726882545.78273: Evaluated conditional (not __network_is_ostree is defined): True 19110 1726882545.78285: variable 'omit' from source: magic vars 19110 1726882545.78346: variable 'omit' from source: magic vars 19110 1726882545.78527: variable '__ostree_booted_stat' from source: set_fact 19110 1726882545.78596: variable 'omit' from source: magic vars 19110 1726882545.78633: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19110 1726882545.78666: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19110 1726882545.78723: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19110 1726882545.78746: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19110 1726882545.78761: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19110 1726882545.78817: variable 'inventory_hostname' from source: host vars for 'managed_node1' 19110 1726882545.78825: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882545.78833: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882545.78961: Set connection var ansible_timeout to 10 19110 1726882545.78988: Set connection var ansible_module_compression to ZIP_DEFLATED 19110 1726882545.78998: Set connection var ansible_shell_executable to /bin/sh 19110 1726882545.79008: Set connection var 
ansible_shell_type to sh 19110 1726882545.79014: Set connection var ansible_connection to ssh 19110 1726882545.79022: Set connection var ansible_pipelining to False 19110 1726882545.79049: variable 'ansible_shell_executable' from source: unknown 19110 1726882545.79070: variable 'ansible_connection' from source: unknown 19110 1726882545.79077: variable 'ansible_module_compression' from source: unknown 19110 1726882545.79083: variable 'ansible_shell_type' from source: unknown 19110 1726882545.79089: variable 'ansible_shell_executable' from source: unknown 19110 1726882545.79094: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882545.79101: variable 'ansible_pipelining' from source: unknown 19110 1726882545.79110: variable 'ansible_timeout' from source: unknown 19110 1726882545.79121: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882545.79223: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 19110 1726882545.79241: variable 'omit' from source: magic vars 19110 1726882545.79252: starting attempt loop 19110 1726882545.79262: running the handler 19110 1726882545.79281: handler run complete 19110 1726882545.79294: attempt loop complete, returning result 19110 1726882545.79300: _execute() done 19110 1726882545.79306: dumping result to json 19110 1726882545.79312: done dumping result, returning 19110 1726882545.79321: done running TaskExecutor() for managed_node1/TASK: Set flag to indicate system is ostree [0e448fcc-3ce9-5372-c19a-000000000090] 19110 1726882545.79330: sending task result for task 0e448fcc-3ce9-5372-c19a-000000000090 19110 1726882545.79432: done sending task result for task 0e448fcc-3ce9-5372-c19a-000000000090 19110 1726882545.79439: 
WORKER PROCESS EXITING ok: [managed_node1] => { "ansible_facts": { "__network_is_ostree": false }, "changed": false } 19110 1726882545.79515: no more pending results, returning what we have 19110 1726882545.79518: results queue empty 19110 1726882545.79519: checking for any_errors_fatal 19110 1726882545.79526: done checking for any_errors_fatal 19110 1726882545.79527: checking for max_fail_percentage 19110 1726882545.79529: done checking for max_fail_percentage 19110 1726882545.79530: checking to see if all hosts have failed and the running result is not ok 19110 1726882545.79531: done checking to see if all hosts have failed 19110 1726882545.79532: getting the remaining hosts for this loop 19110 1726882545.79533: done getting the remaining hosts for this loop 19110 1726882545.79537: getting the next task for host managed_node1 19110 1726882545.79545: done getting next task for host managed_node1 19110 1726882545.79548: ^ task is: TASK: Fix CentOS6 Base repo 19110 1726882545.79553: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19110 1726882545.79559: getting variables 19110 1726882545.79561: in VariableManager get_vars() 19110 1726882545.79592: Calling all_inventory to load vars for managed_node1 19110 1726882545.79595: Calling groups_inventory to load vars for managed_node1 19110 1726882545.79598: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882545.79608: Calling all_plugins_play to load vars for managed_node1 19110 1726882545.79612: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882545.79621: Calling groups_plugins_play to load vars for managed_node1 19110 1726882545.79801: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882545.80013: done with get_vars() 19110 1726882545.80023: done getting variables 19110 1726882545.80274: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Fix CentOS6 Base repo] *************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:26 Friday 20 September 2024 21:35:45 -0400 (0:00:00.041) 0:00:02.661 ****** 19110 1726882545.80420: entering _queue_task() for managed_node1/copy 19110 1726882545.80717: worker is 1 (out of 1 available) 19110 1726882545.80727: exiting _queue_task() for managed_node1/copy 19110 1726882545.80741: done queuing things up, now waiting for results queue to drain 19110 1726882545.80743: waiting for pending results... 
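The two tasks just completed above (the `stat` of `/run/ostree-booted` and the `set_fact` that produced `__network_is_ostree: false`) are not shown in the log, but their shape can be reconstructed from the logged conditionals and variable names. A plausible sketch of what `el_repo_setup.yml` contains at this point — the register name `__ostree_booted_stat` and the condition `not __network_is_ostree is defined` are taken from the log; the exact file contents are an assumption:

```yaml
# Hypothetical reconstruction of the tasks traced above; not the verbatim file.
- name: Check if system is ostree
  stat:
    path: /run/ostree-booted          # log: _execute_module(stat, {'path': '/run/ostree-booted', ...})
  register: __ostree_booted_stat      # log: variable '__ostree_booted_stat' from source: set_fact
  when: not __network_is_ostree is defined

- name: Set flag to indicate system is ostree
  set_fact:
    __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"   # false here, since the stat returned exists: false
  when: not __network_is_ostree is defined
```

This matches the logged results: the `stat` task returned `"exists": false`, so the `set_fact` task set `__network_is_ostree` to `false` without touching the remote host (it ran entirely on the controller, which is why no SSH chunks appear for it).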
19110 1726882545.81018: running TaskExecutor() for managed_node1/TASK: Fix CentOS6 Base repo 19110 1726882545.81125: in run() - task 0e448fcc-3ce9-5372-c19a-000000000092 19110 1726882545.81143: variable 'ansible_search_path' from source: unknown 19110 1726882545.81151: variable 'ansible_search_path' from source: unknown 19110 1726882545.81199: calling self._execute() 19110 1726882545.81274: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882545.81290: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882545.81308: variable 'omit' from source: magic vars 19110 1726882545.81821: variable 'ansible_distribution' from source: facts 19110 1726882545.81857: Evaluated conditional (ansible_distribution == 'CentOS'): True 19110 1726882545.81983: variable 'ansible_distribution_major_version' from source: facts 19110 1726882545.81994: Evaluated conditional (ansible_distribution_major_version == '6'): False 19110 1726882545.82002: when evaluation is False, skipping this task 19110 1726882545.82009: _execute() done 19110 1726882545.82015: dumping result to json 19110 1726882545.82023: done dumping result, returning 19110 1726882545.82032: done running TaskExecutor() for managed_node1/TASK: Fix CentOS6 Base repo [0e448fcc-3ce9-5372-c19a-000000000092] 19110 1726882545.82043: sending task result for task 0e448fcc-3ce9-5372-c19a-000000000092 19110 1726882545.82155: done sending task result for task 0e448fcc-3ce9-5372-c19a-000000000092 19110 1726882545.82167: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 19110 1726882545.82227: no more pending results, returning what we have 19110 1726882545.82231: results queue empty 19110 1726882545.82232: checking for any_errors_fatal 19110 1726882545.82235: done checking for any_errors_fatal 19110 1726882545.82236: checking for 
max_fail_percentage 19110 1726882545.82238: done checking for max_fail_percentage 19110 1726882545.82239: checking to see if all hosts have failed and the running result is not ok 19110 1726882545.82240: done checking to see if all hosts have failed 19110 1726882545.82240: getting the remaining hosts for this loop 19110 1726882545.82242: done getting the remaining hosts for this loop 19110 1726882545.82245: getting the next task for host managed_node1 19110 1726882545.82252: done getting next task for host managed_node1 19110 1726882545.82255: ^ task is: TASK: Include the task 'enable_epel.yml' 19110 1726882545.82258: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19110 1726882545.82262: getting variables 19110 1726882545.82267: in VariableManager get_vars() 19110 1726882545.82295: Calling all_inventory to load vars for managed_node1 19110 1726882545.82299: Calling groups_inventory to load vars for managed_node1 19110 1726882545.82302: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882545.82314: Calling all_plugins_play to load vars for managed_node1 19110 1726882545.82318: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882545.82321: Calling groups_plugins_play to load vars for managed_node1 19110 1726882545.82494: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882545.82739: done with get_vars() 19110 1726882545.82749: done getting variables TASK [Include the task 'enable_epel.yml'] ************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:51 Friday 20 September 2024 21:35:45 -0400 (0:00:00.025) 0:00:02.687 ****** 19110 1726882545.83008: entering _queue_task() for managed_node1/include_tasks 19110 1726882545.83409: worker is 1 (out of 1 available) 19110 1726882545.83419: exiting _queue_task() for managed_node1/include_tasks 19110 1726882545.83430: done queuing things up, now waiting for results queue to drain 19110 1726882545.83431: waiting for pending results... 
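The "Fix CentOS6 Base repo" task above was skipped because its second conditional failed: the log shows `ansible_distribution == 'CentOS'` evaluated True but `ansible_distribution_major_version == '6'` evaluated False. Only the action plugin (`copy`) and the two `when` expressions are visible in the log; the task body below is a hypothetical sketch of how such a guarded task is typically written:

```yaml
# Hypothetical sketch; only the module (copy) and the when conditions
# are confirmed by the log — the destination path and content are assumptions.
- name: Fix CentOS6 Base repo
  copy:
    dest: /etc/yum.repos.d/CentOS-Base.repo   # assumed target
    content: |
      # ... vault/archive mirror configuration for EOL CentOS 6 ...
  when:
    - ansible_distribution == 'CentOS'        # log: evaluated True
    - ansible_distribution_major_version == '6'   # log: evaluated False -> task skipped
```

Because `when` conditions are ANDed, the first False short-circuits the task into `skipping:` with `"false_condition": "ansible_distribution_major_version == '6'"`, exactly as reported for managed_node1.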
19110 1726882545.83666: running TaskExecutor() for managed_node1/TASK: Include the task 'enable_epel.yml' 19110 1726882545.83766: in run() - task 0e448fcc-3ce9-5372-c19a-000000000093 19110 1726882545.83789: variable 'ansible_search_path' from source: unknown 19110 1726882545.83796: variable 'ansible_search_path' from source: unknown 19110 1726882545.83833: calling self._execute() 19110 1726882545.83909: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882545.83922: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882545.83935: variable 'omit' from source: magic vars 19110 1726882545.84411: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 19110 1726882545.86729: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 19110 1726882545.86799: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 19110 1726882545.86843: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 19110 1726882545.86882: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 19110 1726882545.86912: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 19110 1726882545.86997: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19110 1726882545.87028: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19110 1726882545.87066: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19110 1726882545.87111: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19110 1726882545.87128: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19110 1726882545.87243: variable '__network_is_ostree' from source: set_fact 19110 1726882545.87272: Evaluated conditional (not __network_is_ostree | d(false)): True 19110 1726882545.87282: _execute() done 19110 1726882545.87288: dumping result to json 19110 1726882545.87295: done dumping result, returning 19110 1726882545.87303: done running TaskExecutor() for managed_node1/TASK: Include the task 'enable_epel.yml' [0e448fcc-3ce9-5372-c19a-000000000093] 19110 1726882545.87311: sending task result for task 0e448fcc-3ce9-5372-c19a-000000000093 19110 1726882545.87428: no more pending results, returning what we have 19110 1726882545.87434: in VariableManager get_vars() 19110 1726882545.87467: Calling all_inventory to load vars for managed_node1 19110 1726882545.87471: Calling groups_inventory to load vars for managed_node1 19110 1726882545.87474: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882545.87484: Calling all_plugins_play to load vars for managed_node1 19110 1726882545.87488: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882545.87491: Calling groups_plugins_play to load vars for managed_node1 19110 1726882545.87667: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882545.87862: done with get_vars() 19110 1726882545.87874: variable 'ansible_search_path' from source: unknown 19110 
1726882545.87876: variable 'ansible_search_path' from source: unknown 19110 1726882545.87913: we have included files to process 19110 1726882545.87914: generating all_blocks data 19110 1726882545.87916: done generating all_blocks data 19110 1726882545.87925: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 19110 1726882545.87926: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 19110 1726882545.87928: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 19110 1726882545.88453: done sending task result for task 0e448fcc-3ce9-5372-c19a-000000000093 19110 1726882545.88457: WORKER PROCESS EXITING 19110 1726882545.88901: done processing included file 19110 1726882545.88903: iterating over new_blocks loaded from include file 19110 1726882545.88905: in VariableManager get_vars() 19110 1726882545.88916: done with get_vars() 19110 1726882545.88918: filtering new block on tags 19110 1726882545.88942: done filtering new block on tags 19110 1726882545.88945: in VariableManager get_vars() 19110 1726882545.88960: done with get_vars() 19110 1726882545.88962: filtering new block on tags 19110 1726882545.88976: done filtering new block on tags 19110 1726882545.88979: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml for managed_node1 19110 1726882545.88984: extending task lists for all hosts with included blocks 19110 1726882545.89091: done extending task lists 19110 1726882545.89093: done processing included files 19110 1726882545.89094: results queue empty 19110 1726882545.89094: checking for any_errors_fatal 19110 1726882545.89097: done checking for any_errors_fatal 19110 1726882545.89098: checking for max_fail_percentage 19110 
1726882545.89099: done checking for max_fail_percentage 19110 1726882545.89100: checking to see if all hosts have failed and the running result is not ok 19110 1726882545.89100: done checking to see if all hosts have failed 19110 1726882545.89101: getting the remaining hosts for this loop 19110 1726882545.89102: done getting the remaining hosts for this loop 19110 1726882545.89105: getting the next task for host managed_node1 19110 1726882545.89108: done getting next task for host managed_node1 19110 1726882545.89110: ^ task is: TASK: Create EPEL {{ ansible_distribution_major_version }} 19110 1726882545.89113: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19110 1726882545.89116: getting variables 19110 1726882545.89117: in VariableManager get_vars() 19110 1726882545.89124: Calling all_inventory to load vars for managed_node1 19110 1726882545.89126: Calling groups_inventory to load vars for managed_node1 19110 1726882545.89129: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882545.89134: Calling all_plugins_play to load vars for managed_node1 19110 1726882545.89140: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882545.89144: Calling groups_plugins_play to load vars for managed_node1 19110 1726882545.89288: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882545.89495: done with get_vars() 19110 1726882545.89507: done getting variables 19110 1726882545.89573: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) 19110 1726882545.89770: variable 'ansible_distribution_major_version' from source: facts TASK [Create EPEL 9] *********************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:8 Friday 20 September 2024 21:35:45 -0400 (0:00:00.068) 0:00:02.755 ****** 19110 1726882545.89813: entering _queue_task() for managed_node1/command 19110 1726882545.89815: Creating lock for command 19110 1726882545.90060: worker is 1 (out of 1 available) 19110 1726882545.90071: exiting _queue_task() for managed_node1/command 19110 1726882545.90082: done queuing things up, now waiting for results queue to drain 19110 1726882545.90084: waiting for pending results... 
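The templated task name "Create EPEL {{ ansible_distribution_major_version }}" renders as "Create EPEL 9" in the banner above, and the next entries show it being skipped because `ansible_distribution_major_version in ['7', '8']` is False on this EL9 node. The log confirms only the action plugin (`command`) and the two conditionals; the command itself is an illustrative assumption:

```yaml
# Hypothetical sketch of enable_epel.yml:8; the rpm/URL line is an assumption —
# only the command action and the when expressions appear in the log.
- name: Create EPEL {{ ansible_distribution_major_version }}
  command: >-
    rpm -iv
    https://dl.fedoraproject.org/pub/epel/epel-release-latest-{{ ansible_distribution_major_version }}.noarch.rpm
  when:
    - ansible_distribution in ['RedHat', 'CentOS']       # log: evaluated True
    - ansible_distribution_major_version in ['7', '8']   # log: evaluated False -> skipped
```

On EL9 the skip is expected: EPEL enablement for 9 is handled differently, so the test scaffolding only installs the epel-release package this way on 7 and 8.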
19110 1726882545.90330: running TaskExecutor() for managed_node1/TASK: Create EPEL 9
19110 1726882545.90444: in run() - task 0e448fcc-3ce9-5372-c19a-0000000000ad
19110 1726882545.90461: variable 'ansible_search_path' from source: unknown
19110 1726882545.90472: variable 'ansible_search_path' from source: unknown
19110 1726882545.90513: calling self._execute()
19110 1726882545.90589: variable 'ansible_host' from source: host vars for 'managed_node1'
19110 1726882545.90605: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
19110 1726882545.90618: variable 'omit' from source: magic vars
19110 1726882545.90983: variable 'ansible_distribution' from source: facts
19110 1726882545.90999: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True
19110 1726882545.91133: variable 'ansible_distribution_major_version' from source: facts
19110 1726882545.91150: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False
19110 1726882545.91158: when evaluation is False, skipping this task
19110 1726882545.91170: _execute() done
19110 1726882545.91178: dumping result to json
19110 1726882545.91186: done dumping result, returning
19110 1726882545.91195: done running TaskExecutor() for managed_node1/TASK: Create EPEL 9 [0e448fcc-3ce9-5372-c19a-0000000000ad]
19110 1726882545.91206: sending task result for task 0e448fcc-3ce9-5372-c19a-0000000000ad
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version in ['7', '8']",
    "skip_reason": "Conditional result was False"
}
19110 1726882545.91359: no more pending results, returning what we have
19110 1726882545.91365: results queue empty
19110 1726882545.91366: checking for any_errors_fatal
19110 1726882545.91367: done checking for any_errors_fatal
19110 1726882545.91368: checking for max_fail_percentage
19110 1726882545.91370: done checking for max_fail_percentage
19110 1726882545.91370: checking to see if all hosts have failed and the running result is not ok
19110 1726882545.91371: done checking to see if all hosts have failed
19110 1726882545.91372: getting the remaining hosts for this loop
19110 1726882545.91374: done getting the remaining hosts for this loop
19110 1726882545.91377: getting the next task for host managed_node1
19110 1726882545.91383: done getting next task for host managed_node1
19110 1726882545.91386: ^ task is: TASK: Install yum-utils package
19110 1726882545.91390: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
19110 1726882545.91393: getting variables
19110 1726882545.91395: in VariableManager get_vars()
19110 1726882545.91425: Calling all_inventory to load vars for managed_node1
19110 1726882545.91428: Calling groups_inventory to load vars for managed_node1
19110 1726882545.91432: Calling all_plugins_inventory to load vars for managed_node1
19110 1726882545.91444: Calling all_plugins_play to load vars for managed_node1
19110 1726882545.91447: Calling groups_plugins_inventory to load vars for managed_node1
19110 1726882545.91451: Calling groups_plugins_play to load vars for managed_node1
19110 1726882545.91624: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
19110 1726882545.91821: done with get_vars()
19110 1726882545.91830: done getting variables
19110 1726882545.91953: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True)

TASK [Install yum-utils package] ***********************************************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:26
Friday 20 September 2024 21:35:45 -0400 (0:00:00.021) 0:00:02.778 ******
19110 1726882545.92102: entering _queue_task() for managed_node1/package
19110 1726882545.92104: Creating lock for package
19110 1726882545.92134: done sending task result for task 0e448fcc-3ce9-5372-c19a-0000000000ad
19110 1726882545.92138: WORKER PROCESS EXITING
19110 1726882545.92503: worker is 1 (out of 1 available)
19110 1726882545.92515: exiting _queue_task() for managed_node1/package
19110 1726882545.92525: done queuing things up, now waiting for results queue to drain
19110 1726882545.92527: waiting for pending results...
19110 1726882545.92872: running TaskExecutor() for managed_node1/TASK: Install yum-utils package
19110 1726882545.92978: in run() - task 0e448fcc-3ce9-5372-c19a-0000000000ae
19110 1726882545.93000: variable 'ansible_search_path' from source: unknown
19110 1726882545.93008: variable 'ansible_search_path' from source: unknown
19110 1726882545.93044: calling self._execute()
19110 1726882545.93120: variable 'ansible_host' from source: host vars for 'managed_node1'
19110 1726882545.93129: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
19110 1726882545.93141: variable 'omit' from source: magic vars
19110 1726882545.93547: variable 'ansible_distribution' from source: facts
19110 1726882545.93566: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True
19110 1726882545.93697: variable 'ansible_distribution_major_version' from source: facts
19110 1726882545.93708: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False
19110 1726882545.93716: when evaluation is False, skipping this task
19110 1726882545.93723: _execute() done
19110 1726882545.93729: dumping result to json
19110 1726882545.93736: done dumping result, returning
19110 1726882545.93747: done running TaskExecutor() for managed_node1/TASK: Install yum-utils package [0e448fcc-3ce9-5372-c19a-0000000000ae]
19110 1726882545.93761: sending task result for task 0e448fcc-3ce9-5372-c19a-0000000000ae
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version in ['7', '8']",
    "skip_reason": "Conditional result was False"
}
19110 1726882545.93902: no more pending results, returning what we have
19110 1726882545.93906: results queue empty
19110 1726882545.93907: checking for any_errors_fatal
19110 1726882545.93914: done checking for any_errors_fatal
19110 1726882545.93914: checking for max_fail_percentage
19110 1726882545.93916: done checking for max_fail_percentage
19110 1726882545.93917: checking to see if all hosts have failed and the running result is not ok
19110 1726882545.93917: done checking to see if all hosts have failed
19110 1726882545.93918: getting the remaining hosts for this loop
19110 1726882545.93920: done getting the remaining hosts for this loop
19110 1726882545.93923: getting the next task for host managed_node1
19110 1726882545.93929: done getting next task for host managed_node1
19110 1726882545.93932: ^ task is: TASK: Enable EPEL 7
19110 1726882545.93936: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
19110 1726882545.93939: getting variables
19110 1726882545.93941: in VariableManager get_vars()
19110 1726882545.94007: Calling all_inventory to load vars for managed_node1
19110 1726882545.94010: Calling groups_inventory to load vars for managed_node1
19110 1726882545.94014: Calling all_plugins_inventory to load vars for managed_node1
19110 1726882545.94025: Calling all_plugins_play to load vars for managed_node1
19110 1726882545.94028: Calling groups_plugins_inventory to load vars for managed_node1
19110 1726882545.94031: Calling groups_plugins_play to load vars for managed_node1
19110 1726882545.94191: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
19110 1726882545.94398: done with get_vars()
19110 1726882545.94407: done getting variables
19110 1726882545.94474: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
19110 1726882545.94612: done sending task result for task 0e448fcc-3ce9-5372-c19a-0000000000ae
19110 1726882545.94615: WORKER PROCESS EXITING

TASK [Enable EPEL 7] ***********************************************************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:32
Friday 20 September 2024 21:35:45 -0400 (0:00:00.025) 0:00:02.803 ******
19110 1726882545.94626: entering _queue_task() for managed_node1/command
19110 1726882545.94862: worker is 1 (out of 1 available)
19110 1726882545.94875: exiting _queue_task() for managed_node1/command
19110 1726882545.94886: done queuing things up, now waiting for results queue to drain
19110 1726882545.94887: waiting for pending results...
19110 1726882545.95107: running TaskExecutor() for managed_node1/TASK: Enable EPEL 7
19110 1726882545.95213: in run() - task 0e448fcc-3ce9-5372-c19a-0000000000af
19110 1726882545.95234: variable 'ansible_search_path' from source: unknown
19110 1726882545.95241: variable 'ansible_search_path' from source: unknown
19110 1726882545.95281: calling self._execute()
19110 1726882545.95354: variable 'ansible_host' from source: host vars for 'managed_node1'
19110 1726882545.95370: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
19110 1726882545.95382: variable 'omit' from source: magic vars
19110 1726882545.95736: variable 'ansible_distribution' from source: facts
19110 1726882545.95752: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True
19110 1726882545.95883: variable 'ansible_distribution_major_version' from source: facts
19110 1726882545.95894: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False
19110 1726882545.95907: when evaluation is False, skipping this task
19110 1726882545.95915: _execute() done
19110 1726882545.95921: dumping result to json
19110 1726882545.95928: done dumping result, returning
19110 1726882545.95937: done running TaskExecutor() for managed_node1/TASK: Enable EPEL 7 [0e448fcc-3ce9-5372-c19a-0000000000af]
19110 1726882545.95946: sending task result for task 0e448fcc-3ce9-5372-c19a-0000000000af
19110 1726882545.96054: done sending task result for task 0e448fcc-3ce9-5372-c19a-0000000000af
19110 1726882545.96062: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version in ['7', '8']",
    "skip_reason": "Conditional result was False"
}
19110 1726882545.96131: no more pending results, returning what we have
19110 1726882545.96135: results queue empty
19110 1726882545.96136: checking for any_errors_fatal
19110 1726882545.96142: done checking for any_errors_fatal
19110 1726882545.96143: checking for max_fail_percentage
19110 1726882545.96145: done checking for max_fail_percentage
19110 1726882545.96145: checking to see if all hosts have failed and the running result is not ok
19110 1726882545.96146: done checking to see if all hosts have failed
19110 1726882545.96147: getting the remaining hosts for this loop
19110 1726882545.96149: done getting the remaining hosts for this loop
19110 1726882545.96154: getting the next task for host managed_node1
19110 1726882545.96161: done getting next task for host managed_node1
19110 1726882545.96165: ^ task is: TASK: Enable EPEL 8
19110 1726882545.96170: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
19110 1726882545.96173: getting variables
19110 1726882545.96175: in VariableManager get_vars()
19110 1726882545.96205: Calling all_inventory to load vars for managed_node1
19110 1726882545.96208: Calling groups_inventory to load vars for managed_node1
19110 1726882545.96212: Calling all_plugins_inventory to load vars for managed_node1
19110 1726882545.96225: Calling all_plugins_play to load vars for managed_node1
19110 1726882545.96229: Calling groups_plugins_inventory to load vars for managed_node1
19110 1726882545.96232: Calling groups_plugins_play to load vars for managed_node1
19110 1726882545.96405: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
19110 1726882545.96601: done with get_vars()
19110 1726882545.96610: done getting variables
19110 1726882545.96691: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [Enable EPEL 8] ***********************************************************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:37
Friday 20 September 2024 21:35:45 -0400 (0:00:00.020) 0:00:02.824 ******
19110 1726882545.96727: entering _queue_task() for managed_node1/command
19110 1726882545.97118: worker is 1 (out of 1 available)
19110 1726882545.97129: exiting _queue_task() for managed_node1/command
19110 1726882545.97138: done queuing things up, now waiting for results queue to drain
19110 1726882545.97140: waiting for pending results...
19110 1726882545.97386: running TaskExecutor() for managed_node1/TASK: Enable EPEL 8
19110 1726882545.97498: in run() - task 0e448fcc-3ce9-5372-c19a-0000000000b0
19110 1726882545.97517: variable 'ansible_search_path' from source: unknown
19110 1726882545.97525: variable 'ansible_search_path' from source: unknown
19110 1726882545.97569: calling self._execute()
19110 1726882545.97709: variable 'ansible_host' from source: host vars for 'managed_node1'
19110 1726882545.97722: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
19110 1726882545.97736: variable 'omit' from source: magic vars
19110 1726882545.98087: variable 'ansible_distribution' from source: facts
19110 1726882545.98108: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True
19110 1726882545.98246: variable 'ansible_distribution_major_version' from source: facts
19110 1726882545.98257: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False
19110 1726882545.98267: when evaluation is False, skipping this task
19110 1726882545.98275: _execute() done
19110 1726882545.98281: dumping result to json
19110 1726882545.98289: done dumping result, returning
19110 1726882545.98298: done running TaskExecutor() for managed_node1/TASK: Enable EPEL 8 [0e448fcc-3ce9-5372-c19a-0000000000b0]
19110 1726882545.98314: sending task result for task 0e448fcc-3ce9-5372-c19a-0000000000b0
19110 1726882545.98417: done sending task result for task 0e448fcc-3ce9-5372-c19a-0000000000b0
19110 1726882545.98427: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version in ['7', '8']",
    "skip_reason": "Conditional result was False"
}
19110 1726882545.98485: no more pending results, returning what we have
19110 1726882545.98489: results queue empty
19110 1726882545.98490: checking for any_errors_fatal
19110 1726882545.98495: done checking for any_errors_fatal
19110 1726882545.98496: checking for max_fail_percentage
19110 1726882545.98497: done checking for max_fail_percentage
19110 1726882545.98498: checking to see if all hosts have failed and the running result is not ok
19110 1726882545.98499: done checking to see if all hosts have failed
19110 1726882545.98500: getting the remaining hosts for this loop
19110 1726882545.98501: done getting the remaining hosts for this loop
19110 1726882545.98504: getting the next task for host managed_node1
19110 1726882545.98512: done getting next task for host managed_node1
19110 1726882545.98515: ^ task is: TASK: Enable EPEL 6
19110 1726882545.98519: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
19110 1726882545.98522: getting variables
19110 1726882545.98524: in VariableManager get_vars()
19110 1726882545.98593: Calling all_inventory to load vars for managed_node1
19110 1726882545.98596: Calling groups_inventory to load vars for managed_node1
19110 1726882545.98600: Calling all_plugins_inventory to load vars for managed_node1
19110 1726882545.98612: Calling all_plugins_play to load vars for managed_node1
19110 1726882545.98615: Calling groups_plugins_inventory to load vars for managed_node1
19110 1726882545.98619: Calling groups_plugins_play to load vars for managed_node1
19110 1726882545.98778: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
19110 1726882545.98973: done with get_vars()
19110 1726882545.98985: done getting variables
19110 1726882545.99055: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [Enable EPEL 6] ***********************************************************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:42
Friday 20 September 2024 21:35:45 -0400 (0:00:00.024) 0:00:02.849 ******
19110 1726882545.99203: entering _queue_task() for managed_node1/copy
19110 1726882545.99432: worker is 1 (out of 1 available)
19110 1726882545.99443: exiting _queue_task() for managed_node1/copy
19110 1726882545.99453: done queuing things up, now waiting for results queue to drain
19110 1726882545.99455: waiting for pending results...
19110 1726882545.99683: running TaskExecutor() for managed_node1/TASK: Enable EPEL 6
19110 1726882545.99789: in run() - task 0e448fcc-3ce9-5372-c19a-0000000000b2
19110 1726882545.99813: variable 'ansible_search_path' from source: unknown
19110 1726882545.99823: variable 'ansible_search_path' from source: unknown
19110 1726882545.99865: calling self._execute()
19110 1726882545.99939: variable 'ansible_host' from source: host vars for 'managed_node1'
19110 1726882545.99952: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
19110 1726882545.99972: variable 'omit' from source: magic vars
19110 1726882546.00326: variable 'ansible_distribution' from source: facts
19110 1726882546.00346: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True
19110 1726882546.00469: variable 'ansible_distribution_major_version' from source: facts
19110 1726882546.00481: Evaluated conditional (ansible_distribution_major_version == '6'): False
19110 1726882546.00489: when evaluation is False, skipping this task
19110 1726882546.00496: _execute() done
19110 1726882546.00508: dumping result to json
19110 1726882546.00516: done dumping result, returning
19110 1726882546.00527: done running TaskExecutor() for managed_node1/TASK: Enable EPEL 6 [0e448fcc-3ce9-5372-c19a-0000000000b2]
19110 1726882546.00537: sending task result for task 0e448fcc-3ce9-5372-c19a-0000000000b2
19110 1726882546.00646: done sending task result for task 0e448fcc-3ce9-5372-c19a-0000000000b2
19110 1726882546.00654: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '6'",
    "skip_reason": "Conditional result was False"
}
19110 1726882546.00707: no more pending results, returning what we have
19110 1726882546.00710: results queue empty
19110 1726882546.00711: checking for any_errors_fatal
19110 1726882546.00716: done checking for any_errors_fatal
19110 1726882546.00717: checking for max_fail_percentage
19110 1726882546.00719: done checking for max_fail_percentage
19110 1726882546.00720: checking to see if all hosts have failed and the running result is not ok
19110 1726882546.00720: done checking to see if all hosts have failed
19110 1726882546.00721: getting the remaining hosts for this loop
19110 1726882546.00723: done getting the remaining hosts for this loop
19110 1726882546.00726: getting the next task for host managed_node1
19110 1726882546.00735: done getting next task for host managed_node1
19110 1726882546.00738: ^ task is: TASK: Set network provider to 'nm'
19110 1726882546.00741: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
19110 1726882546.00747: getting variables
19110 1726882546.00749: in VariableManager get_vars()
19110 1726882546.00778: Calling all_inventory to load vars for managed_node1
19110 1726882546.00781: Calling groups_inventory to load vars for managed_node1
19110 1726882546.00784: Calling all_plugins_inventory to load vars for managed_node1
19110 1726882546.00795: Calling all_plugins_play to load vars for managed_node1
19110 1726882546.00798: Calling groups_plugins_inventory to load vars for managed_node1
19110 1726882546.00801: Calling groups_plugins_play to load vars for managed_node1
19110 1726882546.00965: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
19110 1726882546.01146: done with get_vars()
19110 1726882546.01153: done getting variables
19110 1726882546.01191: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [Set network provider to 'nm'] ********************************************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tests_ethernet_nm.yml:13
Friday 20 September 2024 21:35:46 -0400 (0:00:00.020) 0:00:02.869 ******
19110 1726882546.01213: entering _queue_task() for managed_node1/set_fact
19110 1726882546.01364: worker is 1 (out of 1 available)
19110 1726882546.01377: exiting _queue_task() for managed_node1/set_fact
19110 1726882546.01387: done queuing things up, now waiting for results queue to drain
19110 1726882546.01389: waiting for pending results...
19110 1726882546.01521: running TaskExecutor() for managed_node1/TASK: Set network provider to 'nm'
19110 1726882546.01571: in run() - task 0e448fcc-3ce9-5372-c19a-000000000007
19110 1726882546.01588: variable 'ansible_search_path' from source: unknown
19110 1726882546.01612: calling self._execute()
19110 1726882546.01718: variable 'ansible_host' from source: host vars for 'managed_node1'
19110 1726882546.01723: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
19110 1726882546.01731: variable 'omit' from source: magic vars
19110 1726882546.01802: variable 'omit' from source: magic vars
19110 1726882546.01823: variable 'omit' from source: magic vars
19110 1726882546.01846: variable 'omit' from source: magic vars
19110 1726882546.01880: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
19110 1726882546.01906: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
19110 1726882546.01925: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
19110 1726882546.01937: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
19110 1726882546.01947: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
19110 1726882546.01972: variable 'inventory_hostname' from source: host vars for 'managed_node1'
19110 1726882546.01975: variable 'ansible_host' from source: host vars for 'managed_node1'
19110 1726882546.01978: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
19110 1726882546.02046: Set connection var ansible_timeout to 10
19110 1726882546.02060: Set connection var ansible_module_compression to ZIP_DEFLATED
19110 1726882546.02066: Set connection var ansible_shell_executable to /bin/sh
19110 1726882546.02068: Set connection var ansible_shell_type to sh
19110 1726882546.02071: Set connection var ansible_connection to ssh
19110 1726882546.02076: Set connection var ansible_pipelining to False
19110 1726882546.02091: variable 'ansible_shell_executable' from source: unknown
19110 1726882546.02093: variable 'ansible_connection' from source: unknown
19110 1726882546.02095: variable 'ansible_module_compression' from source: unknown
19110 1726882546.02098: variable 'ansible_shell_type' from source: unknown
19110 1726882546.02100: variable 'ansible_shell_executable' from source: unknown
19110 1726882546.02102: variable 'ansible_host' from source: host vars for 'managed_node1'
19110 1726882546.02107: variable 'ansible_pipelining' from source: unknown
19110 1726882546.02110: variable 'ansible_timeout' from source: unknown
19110 1726882546.02112: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
19110 1726882546.02209: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
19110 1726882546.02216: variable 'omit' from source: magic vars
19110 1726882546.02222: starting attempt loop
19110 1726882546.02226: running the handler
19110 1726882546.02234: handler run complete
19110 1726882546.02246: attempt loop complete, returning result
19110 1726882546.02249: _execute() done
19110 1726882546.02251: dumping result to json
19110 1726882546.02256: done dumping result, returning
19110 1726882546.02259: done running TaskExecutor() for managed_node1/TASK: Set network provider to 'nm' [0e448fcc-3ce9-5372-c19a-000000000007]
19110 1726882546.02267: sending task result for task 0e448fcc-3ce9-5372-c19a-000000000007
19110 1726882546.02338: done sending task result for task 0e448fcc-3ce9-5372-c19a-000000000007
19110 1726882546.02341: WORKER PROCESS EXITING
ok: [managed_node1] => {
    "ansible_facts": {
        "network_provider": "nm"
    },
    "changed": false
}
19110 1726882546.02451: no more pending results, returning what we have
19110 1726882546.02453: results queue empty
19110 1726882546.02456: checking for any_errors_fatal
19110 1726882546.02458: done checking for any_errors_fatal
19110 1726882546.02459: checking for max_fail_percentage
19110 1726882546.02460: done checking for max_fail_percentage
19110 1726882546.02460: checking to see if all hosts have failed and the running result is not ok
19110 1726882546.02461: done checking to see if all hosts have failed
19110 1726882546.02461: getting the remaining hosts for this loop
19110 1726882546.02462: done getting the remaining hosts for this loop
19110 1726882546.02466: getting the next task for host managed_node1
19110 1726882546.02469: done getting next task for host managed_node1
19110 1726882546.02470: ^ task is: TASK: meta (flush_handlers)
19110 1726882546.02472: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
19110 1726882546.02474: getting variables
19110 1726882546.02475: in VariableManager get_vars()
19110 1726882546.02488: Calling all_inventory to load vars for managed_node1
19110 1726882546.02490: Calling groups_inventory to load vars for managed_node1
19110 1726882546.02491: Calling all_plugins_inventory to load vars for managed_node1
19110 1726882546.02497: Calling all_plugins_play to load vars for managed_node1
19110 1726882546.02499: Calling groups_plugins_inventory to load vars for managed_node1
19110 1726882546.02501: Calling groups_plugins_play to load vars for managed_node1
19110 1726882546.02591: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
19110 1726882546.02713: done with get_vars()
19110 1726882546.02719: done getting variables
19110 1726882546.02760: in VariableManager get_vars()
19110 1726882546.02767: Calling all_inventory to load vars for managed_node1
19110 1726882546.02773: Calling groups_inventory to load vars for managed_node1
19110 1726882546.02783: Calling all_plugins_inventory to load vars for managed_node1
19110 1726882546.02789: Calling all_plugins_play to load vars for managed_node1
19110 1726882546.02791: Calling groups_plugins_inventory to load vars for managed_node1
19110 1726882546.02794: Calling groups_plugins_play to load vars for managed_node1
19110 1726882546.02974: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
19110 1726882546.03159: done with get_vars()
19110 1726882546.03175: done queuing things up, now waiting for results queue to drain
19110 1726882546.03176: results queue empty
19110 1726882546.03177: checking for any_errors_fatal
19110 1726882546.03179: done checking for any_errors_fatal
19110 1726882546.03180: checking for max_fail_percentage
19110 1726882546.03181: done checking for max_fail_percentage
19110 1726882546.03181: checking to see if all hosts have failed and the running result is not ok
19110 1726882546.03182: done checking to see if all hosts have failed
19110 1726882546.03183: getting the remaining hosts for this loop
19110 1726882546.03183: done getting the remaining hosts for this loop
19110 1726882546.03185: getting the next task for host managed_node1
19110 1726882546.03188: done getting next task for host managed_node1
19110 1726882546.03190: ^ task is: TASK: meta (flush_handlers)
19110 1726882546.03191: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
19110 1726882546.03197: getting variables
19110 1726882546.03198: in VariableManager get_vars()
19110 1726882546.03205: Calling all_inventory to load vars for managed_node1
19110 1726882546.03207: Calling groups_inventory to load vars for managed_node1
19110 1726882546.03209: Calling all_plugins_inventory to load vars for managed_node1
19110 1726882546.03213: Calling all_plugins_play to load vars for managed_node1
19110 1726882546.03215: Calling groups_plugins_inventory to load vars for managed_node1
19110 1726882546.03218: Calling groups_plugins_play to load vars for managed_node1
19110 1726882546.03373: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
19110 1726882546.03560: done with get_vars()
19110 1726882546.03582: done getting variables
19110 1726882546.03644: in VariableManager get_vars()
19110 1726882546.03662: Calling all_inventory to load vars for managed_node1
19110 1726882546.03671: Calling groups_inventory to load vars for managed_node1
19110 1726882546.03674: Calling all_plugins_inventory to load vars for managed_node1
19110 1726882546.03684: Calling all_plugins_play to load vars for managed_node1
19110 1726882546.03692: Calling groups_plugins_inventory to load vars for managed_node1
19110 1726882546.03701: Calling groups_plugins_play to load vars for managed_node1
19110 1726882546.03892: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
19110 1726882546.04023: done with get_vars()
19110 1726882546.04031: done queuing things up, now waiting for results queue to drain
19110 1726882546.04032: results queue empty
19110 1726882546.04032: checking for any_errors_fatal
19110 1726882546.04033: done checking for any_errors_fatal
19110 1726882546.04033: checking for max_fail_percentage
19110 1726882546.04034: done checking for max_fail_percentage
19110 1726882546.04034: checking to see if all hosts have failed and the running result is not ok
19110 1726882546.04035: done checking to see if all hosts have failed
19110 1726882546.04035: getting the remaining hosts for this loop
19110 1726882546.04036: done getting the remaining hosts for this loop
19110 1726882546.04037: getting the next task for host managed_node1
19110 1726882546.04039: done getting next task for host managed_node1
19110 1726882546.04039: ^ task is: None
19110 1726882546.04040: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task?
False 19110 1726882546.04041: done queuing things up, now waiting for results queue to drain 19110 1726882546.04041: results queue empty 19110 1726882546.04042: checking for any_errors_fatal 19110 1726882546.04042: done checking for any_errors_fatal 19110 1726882546.04042: checking for max_fail_percentage 19110 1726882546.04043: done checking for max_fail_percentage 19110 1726882546.04043: checking to see if all hosts have failed and the running result is not ok 19110 1726882546.04044: done checking to see if all hosts have failed 19110 1726882546.04045: getting the next task for host managed_node1 19110 1726882546.04046: done getting next task for host managed_node1 19110 1726882546.04047: ^ task is: None 19110 1726882546.04048: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19110 1726882546.04082: in VariableManager get_vars() 19110 1726882546.04095: done with get_vars() 19110 1726882546.04099: in VariableManager get_vars() 19110 1726882546.04104: done with get_vars() 19110 1726882546.04107: variable 'omit' from source: magic vars 19110 1726882546.04129: in VariableManager get_vars() 19110 1726882546.04136: done with get_vars() 19110 1726882546.04149: variable 'omit' from source: magic vars PLAY [Play for showing the network provider] *********************************** 19110 1726882546.04259: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 19110 1726882546.04279: getting the remaining hosts for this loop 19110 1726882546.04280: done getting the remaining hosts for this loop 19110 1726882546.04282: getting the next task for host managed_node1 19110 1726882546.04284: done getting next task for host managed_node1 19110 1726882546.04285: ^ task is: TASK: Gathering Facts 19110 1726882546.04286: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19110 1726882546.04287: getting variables 19110 1726882546.04287: in VariableManager get_vars() 19110 1726882546.04292: Calling all_inventory to load vars for managed_node1 19110 1726882546.04294: Calling groups_inventory to load vars for managed_node1 19110 1726882546.04295: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882546.04298: Calling all_plugins_play to load vars for managed_node1 19110 1726882546.04308: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882546.04311: Calling groups_plugins_play to load vars for managed_node1 19110 1726882546.04523: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882546.04624: done with get_vars() 19110 1726882546.04629: done getting variables 19110 1726882546.04656: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:3 Friday 20 September 2024 21:35:46 -0400 (0:00:00.034) 0:00:02.903 ****** 19110 1726882546.04672: entering _queue_task() for managed_node1/gather_facts 19110 1726882546.04807: worker is 1 (out of 1 available) 19110 1726882546.04817: exiting _queue_task() for managed_node1/gather_facts 19110 1726882546.04828: done queuing things up, now waiting for results queue to drain 19110 1726882546.04830: waiting for pending results... 
19110 1726882546.04961: running TaskExecutor() for managed_node1/TASK: Gathering Facts 19110 1726882546.05020: in run() - task 0e448fcc-3ce9-5372-c19a-0000000000d8 19110 1726882546.05030: variable 'ansible_search_path' from source: unknown 19110 1726882546.05060: calling self._execute() 19110 1726882546.05109: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882546.05116: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882546.05123: variable 'omit' from source: magic vars 19110 1726882546.05360: variable 'ansible_distribution_major_version' from source: facts 19110 1726882546.05370: Evaluated conditional (ansible_distribution_major_version != '6'): True 19110 1726882546.05376: variable 'omit' from source: magic vars 19110 1726882546.05397: variable 'omit' from source: magic vars 19110 1726882546.05420: variable 'omit' from source: magic vars 19110 1726882546.05448: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19110 1726882546.05476: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19110 1726882546.05491: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19110 1726882546.05506: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19110 1726882546.05515: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19110 1726882546.05535: variable 'inventory_hostname' from source: host vars for 'managed_node1' 19110 1726882546.05538: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882546.05541: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882546.05607: Set connection var ansible_timeout to 10 19110 1726882546.05618: Set connection 
var ansible_module_compression to ZIP_DEFLATED 19110 1726882546.05621: Set connection var ansible_shell_executable to /bin/sh 19110 1726882546.05624: Set connection var ansible_shell_type to sh 19110 1726882546.05626: Set connection var ansible_connection to ssh 19110 1726882546.05631: Set connection var ansible_pipelining to False 19110 1726882546.05648: variable 'ansible_shell_executable' from source: unknown 19110 1726882546.05651: variable 'ansible_connection' from source: unknown 19110 1726882546.05654: variable 'ansible_module_compression' from source: unknown 19110 1726882546.05656: variable 'ansible_shell_type' from source: unknown 19110 1726882546.05661: variable 'ansible_shell_executable' from source: unknown 19110 1726882546.05665: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882546.05670: variable 'ansible_pipelining' from source: unknown 19110 1726882546.05672: variable 'ansible_timeout' from source: unknown 19110 1726882546.05676: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882546.05802: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 19110 1726882546.05811: variable 'omit' from source: magic vars 19110 1726882546.05814: starting attempt loop 19110 1726882546.05817: running the handler 19110 1726882546.05845: variable 'ansible_facts' from source: unknown 19110 1726882546.05867: _low_level_execute_command(): starting 19110 1726882546.05874: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 19110 1726882546.06630: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882546.06644: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 
1726882546.06667: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882546.06687: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882546.06728: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882546.06740: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882546.06753: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882546.06777: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882546.06790: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882546.06801: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882546.06814: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882546.06830: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882546.06834: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882546.06914: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882546.06930: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882546.07061: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882546.08718: stdout chunk (state=3): >>>/root <<< 19110 1726882546.08819: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882546.08866: stderr chunk (state=3): >>><<< 19110 1726882546.08872: 
stdout chunk (state=3): >>><<< 19110 1726882546.08888: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 1726882546.08898: _low_level_execute_command(): starting 19110 1726882546.08904: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882546.0888813-19235-227909674209975 `" && echo ansible-tmp-1726882546.0888813-19235-227909674209975="` echo /root/.ansible/tmp/ansible-tmp-1726882546.0888813-19235-227909674209975 `" ) && sleep 0' 19110 1726882546.09363: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882546.09377: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882546.09400: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882546.09432: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882546.09484: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882546.09490: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882546.09593: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882546.11444: stdout chunk (state=3): >>>ansible-tmp-1726882546.0888813-19235-227909674209975=/root/.ansible/tmp/ansible-tmp-1726882546.0888813-19235-227909674209975 <<< 19110 1726882546.11605: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882546.11645: stderr chunk (state=3): >>><<< 19110 1726882546.11648: stdout chunk (state=3): >>><<< 19110 1726882546.11659: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882546.0888813-19235-227909674209975=/root/.ansible/tmp/ansible-tmp-1726882546.0888813-19235-227909674209975 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 1726882546.11682: variable 'ansible_module_compression' from source: unknown 19110 1726882546.11721: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-191108pnkimox/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 19110 1726882546.11784: variable 'ansible_facts' from source: unknown 19110 1726882546.11904: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882546.0888813-19235-227909674209975/AnsiballZ_setup.py 19110 1726882546.12014: Sending initial data 19110 1726882546.12017: Sent initial data (154 bytes) 19110 1726882546.12649: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882546.12653: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882546.12690: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found <<< 19110 1726882546.12693: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882546.12695: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882546.12750: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882546.12755: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882546.12852: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882546.14607: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 <<< 19110 1726882546.14612: stderr chunk (state=3): >>>debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 19110 1726882546.14682: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 19110 1726882546.14785: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-191108pnkimox/tmpaootxm7d 
/root/.ansible/tmp/ansible-tmp-1726882546.0888813-19235-227909674209975/AnsiballZ_setup.py <<< 19110 1726882546.14887: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 19110 1726882546.17486: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882546.17650: stderr chunk (state=3): >>><<< 19110 1726882546.17656: stdout chunk (state=3): >>><<< 19110 1726882546.17659: done transferring module to remote 19110 1726882546.17662: _low_level_execute_command(): starting 19110 1726882546.17671: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882546.0888813-19235-227909674209975/ /root/.ansible/tmp/ansible-tmp-1726882546.0888813-19235-227909674209975/AnsiballZ_setup.py && sleep 0' 19110 1726882546.18221: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882546.18238: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882546.18257: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882546.18280: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882546.18322: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882546.18339: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882546.18358: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882546.18380: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882546.18392: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882546.18405: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882546.18418: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config <<< 19110 1726882546.18436: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882546.18460: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882546.18477: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882546.18490: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882546.18504: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882546.18586: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882546.18603: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882546.18618: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882546.18744: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882546.20460: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882546.20522: stderr chunk (state=3): >>><<< 19110 1726882546.20525: stdout chunk (state=3): >>><<< 19110 1726882546.20620: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 1726882546.20624: _low_level_execute_command(): starting 19110 1726882546.20626: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882546.0888813-19235-227909674209975/AnsiballZ_setup.py && sleep 0' 19110 1726882546.21229: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882546.21243: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882546.21261: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882546.21287: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882546.21331: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882546.21343: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882546.21359: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882546.21382: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882546.21400: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882546.21416: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882546.21429: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882546.21442: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 19110 1726882546.21462: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882546.21478: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882546.21492: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882546.21511: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882546.21593: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882546.21621: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882546.21643: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882546.21774: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882546.72411: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_system": "Linux", "ansible_kernel": "5.14.0-508.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 15:49:37 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "ip-10-31-44-90.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-44-90", "ansible_nodename": "ip-10-31-44-90.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", 
"ansible_userspace_architecture": "x86_64", "ansible_machine_id": "465293f2bd9b457497a5eaf565f184f8", "ansible_is_chroot": false, "ansible_fips": false, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBANd7RrvRqB/kTBmp4g3vOXSd1XQX7zSKmnuTmpsZ60vmB0PwXJIj7HTI9DBqGrzofXOOHlh/Qs4ml+D2H79EO+6EUEZza9meZ+anJVVwXBuxgfn1Hl/EExsmp7gV5o+r0rCFeuFds6ISUZQpal2OlQk3lrit00RA//JoxvQO1YQdAAAAFQDxLJGSeERY5sZYyGr88om1BNq9KQAAAIEA1WO5ElHMof6Upf9GQZn4wlJh6JUOjZfYnL4XATh/W6uye7kuC/rBqGPirkmks1GCUkKhSkzQlRIPyLYENrbPKRMNarYAwwQ8N8BDOUWFDCcrO55SJdlLTyYGWdlVFysYGSMIyZT5ye4oL3Nkff/e1ZGqjvL0sLNJaye4za70Xj4AAACAXRnpJPDKgCzakgHO14HsH3r7qS4cvfQld+ThBJZcbtKtJUyUo1sqg9NbvK+hvA41TYWOVy52DcWn5yzPwfhAn+mQEcAdBXWqSSJdESZ9fPnbc9C1XO02sYpyX5+wsZnau23XhtnlnY8jRTpWgRt4X8AWaSND9mfeL8ZDTgVphRc=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCi3knQjBMOMykE1WC1CWkOsV5q0hmL0iOCSTq050rqbxjUmkXoe4BeOWZh4ID8k5GgD5emRh5PU5ME25rsP6hnelUBzDOCjdgI1rmTkUbw5oFRN+kpX2RbAM/2L5J6RrqqllFBjrHtSG6JINsPIWqLn9rlmE965Kj4AY3QplgiApeN07yDM5YPS2tpRpVHVIxZia7pdzKs+h+TXJoo/Z7SMAw8MTUNeIXd9bSzuxhulOrLNrvrcej4EVg88FYiy96oA+NBNLFb41RHNLumM3xUQvjCeyqP1hcUw9YAY+gwADhRGBKQ+JCAzzLqyM/3RAO8gXXJr1Yjr+H9xi8IwKB71QU/lw7bWc33YuNbe5mDlUHQ/a2qvo4O96wD8m4eZpu81iHiwtIU5cwKm+fk8sz9kxOR77AozaYLtjgW9FYUmxh2ChfTBV2rnFflhC3CjgRMlZv8CLtne5JcsRFSpHeCB2RXzA1JPiF89OxoUFa8NsPqdUyAMUkaR8MmW+fj+t8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAqf0PBdBRTqmYqGYFABEC2soCa/Gsi3A2munUEZEo0enWjwRQivB5pKJDjOn6lwgGyr2ebU0/VpRzddk73uEfk=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOdij61fBkKDElS700Z560nYW2c4QCIx/VplUW7jn+UE", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": 
"UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_apparmor": {"status": "disabled"}, "ansible_fibre_channel_wwn": [], "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2811, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 721, "free": 2811}, "nocache": {"free": 3273, "used": 259}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2ca037-5a7b-1ebc-28c4-7fb32adce23a", "ansible_product_uuid": "ec2ca037-5a7b-1ebc-28c4-7fb32adce23a", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", 
"support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 704, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264239316992, "block_size": 4096, "block_total": 65519355, "block_available": 64511552, "block_used": 1007803, "inode_total": 131071472, "inode_available": 130998699, "inode_used": 72773, "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013"}], "ansible_env": {"SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.40.7 34614 10.31.44.90 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.40.7 34614 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias 
--read-functions --show-tilde --show-dot $@\n}"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_loadavg": {"1m": 0.46, "5m": 0.39, "15m": 0.21}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_iscsi_iqn": "", "ansible_local": {}, "ansible_service_mgr": "systemd", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:ef4e1c39-6f50-438a-87e7-12fb70b80bde", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "35", "second": "46", "epoch": "1726882546", "epoch_int": "1726882546", "date": "2024-09-20", "time": "21:35:46", "iso8601_micro": "2024-09-21T01:35:46.684144Z", "iso8601": "2024-09-21T01:35:46Z", "iso8601_basic": "20240920T213546684144", "iso8601_basic_short": "20240920T213546", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "02:9e:a1:0b:f9:6d", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.44.90", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": 
"fe80::9e:a1ff:fe0b:f96d", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", 
"tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", 
"tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.44.90", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:9e:a1:0b:f9:6d", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.44.90"], "ansible_all_ipv6_addresses": ["fe80::9e:a1ff:fe0b:f96d"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.44.90", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::9e:a1ff:fe0b:f96d"]}, "ansible_lsb": {}, "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 19110 1726882546.73917: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
<<< 19110 1726882546.73973: stderr chunk (state=3): >>><<< 19110 1726882546.73977: stdout chunk (state=3): >>><<< 19110 1726882546.74005: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_system": "Linux", "ansible_kernel": "5.14.0-508.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 15:49:37 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "ip-10-31-44-90.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-44-90", "ansible_nodename": "ip-10-31-44-90.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "465293f2bd9b457497a5eaf565f184f8", "ansible_is_chroot": false, "ansible_fips": false, "ansible_ssh_host_key_dsa_public": 
"AAAAB3NzaC1kc3MAAACBANd7RrvRqB/kTBmp4g3vOXSd1XQX7zSKmnuTmpsZ60vmB0PwXJIj7HTI9DBqGrzofXOOHlh/Qs4ml+D2H79EO+6EUEZza9meZ+anJVVwXBuxgfn1Hl/EExsmp7gV5o+r0rCFeuFds6ISUZQpal2OlQk3lrit00RA//JoxvQO1YQdAAAAFQDxLJGSeERY5sZYyGr88om1BNq9KQAAAIEA1WO5ElHMof6Upf9GQZn4wlJh6JUOjZfYnL4XATh/W6uye7kuC/rBqGPirkmks1GCUkKhSkzQlRIPyLYENrbPKRMNarYAwwQ8N8BDOUWFDCcrO55SJdlLTyYGWdlVFysYGSMIyZT5ye4oL3Nkff/e1ZGqjvL0sLNJaye4za70Xj4AAACAXRnpJPDKgCzakgHO14HsH3r7qS4cvfQld+ThBJZcbtKtJUyUo1sqg9NbvK+hvA41TYWOVy52DcWn5yzPwfhAn+mQEcAdBXWqSSJdESZ9fPnbc9C1XO02sYpyX5+wsZnau23XhtnlnY8jRTpWgRt4X8AWaSND9mfeL8ZDTgVphRc=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCi3knQjBMOMykE1WC1CWkOsV5q0hmL0iOCSTq050rqbxjUmkXoe4BeOWZh4ID8k5GgD5emRh5PU5ME25rsP6hnelUBzDOCjdgI1rmTkUbw5oFRN+kpX2RbAM/2L5J6RrqqllFBjrHtSG6JINsPIWqLn9rlmE965Kj4AY3QplgiApeN07yDM5YPS2tpRpVHVIxZia7pdzKs+h+TXJoo/Z7SMAw8MTUNeIXd9bSzuxhulOrLNrvrcej4EVg88FYiy96oA+NBNLFb41RHNLumM3xUQvjCeyqP1hcUw9YAY+gwADhRGBKQ+JCAzzLqyM/3RAO8gXXJr1Yjr+H9xi8IwKB71QU/lw7bWc33YuNbe5mDlUHQ/a2qvo4O96wD8m4eZpu81iHiwtIU5cwKm+fk8sz9kxOR77AozaYLtjgW9FYUmxh2ChfTBV2rnFflhC3CjgRMlZv8CLtne5JcsRFSpHeCB2RXzA1JPiF89OxoUFa8NsPqdUyAMUkaR8MmW+fj+t8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAqf0PBdBRTqmYqGYFABEC2soCa/Gsi3A2munUEZEo0enWjwRQivB5pKJDjOn6lwgGyr2ebU0/VpRzddk73uEfk=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOdij61fBkKDElS700Z560nYW2c4QCIx/VplUW7jn+UE", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": 
{"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_apparmor": {"status": "disabled"}, "ansible_fibre_channel_wwn": [], "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2811, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 721, "free": 2811}, "nocache": {"free": 3273, "used": 259}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2ca037-5a7b-1ebc-28c4-7fb32adce23a", "ansible_product_uuid": "ec2ca037-5a7b-1ebc-28c4-7fb32adce23a", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", 
"sectorsize": 512, "size": "250.00 GB", "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 704, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264239316992, "block_size": 4096, "block_total": 65519355, "block_available": 64511552, "block_used": 1007803, "inode_total": 131071472, "inode_available": 130998699, "inode_used": 72773, "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013"}], "ansible_env": {"SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.40.7 34614 10.31.44.90 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.40.7 34614 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, 
"ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_loadavg": {"1m": 0.46, "5m": 0.39, "15m": 0.21}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_iscsi_iqn": "", "ansible_local": {}, "ansible_service_mgr": "systemd", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:ef4e1c39-6f50-438a-87e7-12fb70b80bde", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "35", "second": "46", "epoch": "1726882546", "epoch_int": "1726882546", "date": "2024-09-20", "time": "21:35:46", "iso8601_micro": "2024-09-21T01:35:46.684144Z", "iso8601": "2024-09-21T01:35:46Z", "iso8601_basic": "20240920T213546684144", "iso8601_basic_short": "20240920T213546", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "02:9e:a1:0b:f9:6d", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.44.90", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::9e:a1ff:fe0b:f96d", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", 
"tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", 
"rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off 
[fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.44.90", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:9e:a1:0b:f9:6d", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.44.90"], "ansible_all_ipv6_addresses": ["fe80::9e:a1ff:fe0b:f96d"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.44.90", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::9e:a1ff:fe0b:f96d"]}, "ansible_lsb": {}, "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 19110 1726882546.74213: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882546.0888813-19235-227909674209975/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 19110 1726882546.74230: _low_level_execute_command(): starting 19110 1726882546.74233: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882546.0888813-19235-227909674209975/ > /dev/null 2>&1 && sleep 0' 19110 1726882546.74682: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882546.74688: stderr chunk (state=3): >>>debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882546.74718: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882546.74721: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882546.74724: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882546.74779: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882546.74783: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882546.74884: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882546.76672: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882546.76714: stderr chunk (state=3): >>><<< 19110 1726882546.76718: stdout chunk (state=3): >>><<< 19110 1726882546.76729: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 1726882546.76742: handler run complete 19110 1726882546.76817: variable 'ansible_facts' from source: unknown 19110 1726882546.76883: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882546.77053: variable 'ansible_facts' from source: unknown 19110 1726882546.77110: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882546.77188: attempt loop complete, returning result 19110 1726882546.77191: _execute() done 19110 1726882546.77194: dumping result to json 19110 1726882546.77211: done dumping result, returning 19110 1726882546.77219: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [0e448fcc-3ce9-5372-c19a-0000000000d8] 19110 1726882546.77224: sending task result for task 0e448fcc-3ce9-5372-c19a-0000000000d8 19110 1726882546.77480: done sending task result for task 0e448fcc-3ce9-5372-c19a-0000000000d8 19110 1726882546.77483: WORKER PROCESS EXITING ok: [managed_node1] 19110 1726882546.77670: no more pending results, returning what we have 19110 1726882546.77672: results queue empty 19110 1726882546.77673: checking for any_errors_fatal 19110 1726882546.77674: done checking for any_errors_fatal 19110 
1726882546.77674: checking for max_fail_percentage 19110 1726882546.77675: done checking for max_fail_percentage 19110 1726882546.77676: checking to see if all hosts have failed and the running result is not ok 19110 1726882546.77676: done checking to see if all hosts have failed 19110 1726882546.77677: getting the remaining hosts for this loop 19110 1726882546.77677: done getting the remaining hosts for this loop 19110 1726882546.77680: getting the next task for host managed_node1 19110 1726882546.77683: done getting next task for host managed_node1 19110 1726882546.77684: ^ task is: TASK: meta (flush_handlers) 19110 1726882546.77686: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19110 1726882546.77688: getting variables 19110 1726882546.77689: in VariableManager get_vars() 19110 1726882546.77707: Calling all_inventory to load vars for managed_node1 19110 1726882546.77709: Calling groups_inventory to load vars for managed_node1 19110 1726882546.77711: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882546.77719: Calling all_plugins_play to load vars for managed_node1 19110 1726882546.77720: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882546.77722: Calling groups_plugins_play to load vars for managed_node1 19110 1726882546.77819: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882546.77945: done with get_vars() 19110 1726882546.77952: done getting variables 19110 1726882546.78000: in VariableManager get_vars() 19110 1726882546.78006: Calling all_inventory to load vars for managed_node1 19110 1726882546.78008: Calling groups_inventory to load vars for managed_node1 19110 1726882546.78009: Calling 
all_plugins_inventory to load vars for managed_node1 19110 1726882546.78012: Calling all_plugins_play to load vars for managed_node1 19110 1726882546.78013: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882546.78015: Calling groups_plugins_play to load vars for managed_node1 19110 1726882546.78096: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882546.78202: done with get_vars() 19110 1726882546.78210: done queuing things up, now waiting for results queue to drain 19110 1726882546.78212: results queue empty 19110 1726882546.78212: checking for any_errors_fatal 19110 1726882546.78214: done checking for any_errors_fatal 19110 1726882546.78214: checking for max_fail_percentage 19110 1726882546.78215: done checking for max_fail_percentage 19110 1726882546.78215: checking to see if all hosts have failed and the running result is not ok 19110 1726882546.78219: done checking to see if all hosts have failed 19110 1726882546.78219: getting the remaining hosts for this loop 19110 1726882546.78220: done getting the remaining hosts for this loop 19110 1726882546.78222: getting the next task for host managed_node1 19110 1726882546.78224: done getting next task for host managed_node1 19110 1726882546.78226: ^ task is: TASK: Show inside ethernet tests 19110 1726882546.78227: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19110 1726882546.78228: getting variables 19110 1726882546.78228: in VariableManager get_vars() 19110 1726882546.78233: Calling all_inventory to load vars for managed_node1 19110 1726882546.78234: Calling groups_inventory to load vars for managed_node1 19110 1726882546.78236: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882546.78239: Calling all_plugins_play to load vars for managed_node1 19110 1726882546.78240: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882546.78241: Calling groups_plugins_play to load vars for managed_node1 19110 1726882546.78321: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882546.78425: done with get_vars() 19110 1726882546.78430: done getting variables 19110 1726882546.78487: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Show inside ethernet tests] ********************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:6 Friday 20 September 2024 21:35:46 -0400 (0:00:00.738) 0:00:03.642 ****** 19110 1726882546.78506: entering _queue_task() for managed_node1/debug 19110 1726882546.78507: Creating lock for debug 19110 1726882546.78689: worker is 1 (out of 1 available) 19110 1726882546.78702: exiting _queue_task() for managed_node1/debug 19110 1726882546.78713: done queuing things up, now waiting for results queue to drain 19110 1726882546.78715: waiting for pending results... 
19110 1726882546.78877: running TaskExecutor() for managed_node1/TASK: Show inside ethernet tests 19110 1726882546.78954: in run() - task 0e448fcc-3ce9-5372-c19a-00000000000b 19110 1726882546.78968: variable 'ansible_search_path' from source: unknown 19110 1726882546.78997: calling self._execute() 19110 1726882546.79074: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882546.79080: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882546.79088: variable 'omit' from source: magic vars 19110 1726882546.79415: variable 'ansible_distribution_major_version' from source: facts 19110 1726882546.79425: Evaluated conditional (ansible_distribution_major_version != '6'): True 19110 1726882546.79430: variable 'omit' from source: magic vars 19110 1726882546.79456: variable 'omit' from source: magic vars 19110 1726882546.79479: variable 'omit' from source: magic vars 19110 1726882546.79511: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19110 1726882546.79536: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19110 1726882546.79551: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19110 1726882546.79571: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19110 1726882546.79579: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19110 1726882546.79602: variable 'inventory_hostname' from source: host vars for 'managed_node1' 19110 1726882546.79606: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882546.79608: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882546.79678: Set connection var ansible_timeout to 10 19110 1726882546.79687: Set 
connection var ansible_module_compression to ZIP_DEFLATED 19110 1726882546.79693: Set connection var ansible_shell_executable to /bin/sh 19110 1726882546.79696: Set connection var ansible_shell_type to sh 19110 1726882546.79698: Set connection var ansible_connection to ssh 19110 1726882546.79703: Set connection var ansible_pipelining to False 19110 1726882546.79721: variable 'ansible_shell_executable' from source: unknown 19110 1726882546.79724: variable 'ansible_connection' from source: unknown 19110 1726882546.79726: variable 'ansible_module_compression' from source: unknown 19110 1726882546.79729: variable 'ansible_shell_type' from source: unknown 19110 1726882546.79731: variable 'ansible_shell_executable' from source: unknown 19110 1726882546.79733: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882546.79735: variable 'ansible_pipelining' from source: unknown 19110 1726882546.79738: variable 'ansible_timeout' from source: unknown 19110 1726882546.79742: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882546.79844: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 19110 1726882546.79852: variable 'omit' from source: magic vars 19110 1726882546.79860: starting attempt loop 19110 1726882546.79865: running the handler 19110 1726882546.79901: handler run complete 19110 1726882546.79917: attempt loop complete, returning result 19110 1726882546.79921: _execute() done 19110 1726882546.79923: dumping result to json 19110 1726882546.79925: done dumping result, returning 19110 1726882546.79930: done running TaskExecutor() for managed_node1/TASK: Show inside ethernet tests [0e448fcc-3ce9-5372-c19a-00000000000b] 19110 1726882546.79936: sending task result for 
task 0e448fcc-3ce9-5372-c19a-00000000000b 19110 1726882546.80021: done sending task result for task 0e448fcc-3ce9-5372-c19a-00000000000b 19110 1726882546.80024: WORKER PROCESS EXITING ok: [managed_node1] => {} MSG: Inside ethernet tests 19110 1726882546.80078: no more pending results, returning what we have 19110 1726882546.80081: results queue empty 19110 1726882546.80081: checking for any_errors_fatal 19110 1726882546.80083: done checking for any_errors_fatal 19110 1726882546.80084: checking for max_fail_percentage 19110 1726882546.80085: done checking for max_fail_percentage 19110 1726882546.80085: checking to see if all hosts have failed and the running result is not ok 19110 1726882546.80086: done checking to see if all hosts have failed 19110 1726882546.80087: getting the remaining hosts for this loop 19110 1726882546.80088: done getting the remaining hosts for this loop 19110 1726882546.80091: getting the next task for host managed_node1 19110 1726882546.80095: done getting next task for host managed_node1 19110 1726882546.80097: ^ task is: TASK: Show network_provider 19110 1726882546.80105: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19110 1726882546.80108: getting variables 19110 1726882546.80110: in VariableManager get_vars() 19110 1726882546.80133: Calling all_inventory to load vars for managed_node1 19110 1726882546.80135: Calling groups_inventory to load vars for managed_node1 19110 1726882546.80138: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882546.80145: Calling all_plugins_play to load vars for managed_node1 19110 1726882546.80147: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882546.80149: Calling groups_plugins_play to load vars for managed_node1 19110 1726882546.80288: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882546.80395: done with get_vars() 19110 1726882546.80401: done getting variables 19110 1726882546.80440: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show network_provider] *************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:9 Friday 20 September 2024 21:35:46 -0400 (0:00:00.019) 0:00:03.661 ****** 19110 1726882546.80458: entering _queue_task() for managed_node1/debug 19110 1726882546.80696: worker is 1 (out of 1 available) 19110 1726882546.80707: exiting _queue_task() for managed_node1/debug 19110 1726882546.80718: done queuing things up, now waiting for results queue to drain 19110 1726882546.80720: waiting for pending results... 
19110 1726882546.80991: running TaskExecutor() for managed_node1/TASK: Show network_provider 19110 1726882546.81087: in run() - task 0e448fcc-3ce9-5372-c19a-00000000000c 19110 1726882546.81110: variable 'ansible_search_path' from source: unknown 19110 1726882546.81148: calling self._execute() 19110 1726882546.81229: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882546.81239: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882546.81251: variable 'omit' from source: magic vars 19110 1726882546.81646: variable 'ansible_distribution_major_version' from source: facts 19110 1726882546.81669: Evaluated conditional (ansible_distribution_major_version != '6'): True 19110 1726882546.81679: variable 'omit' from source: magic vars 19110 1726882546.81715: variable 'omit' from source: magic vars 19110 1726882546.81777: variable 'omit' from source: magic vars 19110 1726882546.81816: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19110 1726882546.81841: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19110 1726882546.81862: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19110 1726882546.81890: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19110 1726882546.81907: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19110 1726882546.81929: variable 'inventory_hostname' from source: host vars for 'managed_node1' 19110 1726882546.81932: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882546.81936: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882546.82006: Set connection var ansible_timeout to 10 19110 1726882546.82015: Set 
connection var ansible_module_compression to ZIP_DEFLATED 19110 1726882546.82020: Set connection var ansible_shell_executable to /bin/sh 19110 1726882546.82022: Set connection var ansible_shell_type to sh 19110 1726882546.82026: Set connection var ansible_connection to ssh 19110 1726882546.82031: Set connection var ansible_pipelining to False 19110 1726882546.82048: variable 'ansible_shell_executable' from source: unknown 19110 1726882546.82059: variable 'ansible_connection' from source: unknown 19110 1726882546.82062: variable 'ansible_module_compression' from source: unknown 19110 1726882546.82067: variable 'ansible_shell_type' from source: unknown 19110 1726882546.82070: variable 'ansible_shell_executable' from source: unknown 19110 1726882546.82072: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882546.82074: variable 'ansible_pipelining' from source: unknown 19110 1726882546.82076: variable 'ansible_timeout' from source: unknown 19110 1726882546.82078: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882546.82177: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 19110 1726882546.82185: variable 'omit' from source: magic vars 19110 1726882546.82190: starting attempt loop 19110 1726882546.82193: running the handler 19110 1726882546.82229: variable 'network_provider' from source: set_fact 19110 1726882546.82284: variable 'network_provider' from source: set_fact 19110 1726882546.82301: handler run complete 19110 1726882546.82314: attempt loop complete, returning result 19110 1726882546.82321: _execute() done 19110 1726882546.82324: dumping result to json 19110 1726882546.82326: done dumping result, returning 19110 1726882546.82333: done running 
TaskExecutor() for managed_node1/TASK: Show network_provider [0e448fcc-3ce9-5372-c19a-00000000000c] 19110 1726882546.82338: sending task result for task 0e448fcc-3ce9-5372-c19a-00000000000c 19110 1726882546.82416: done sending task result for task 0e448fcc-3ce9-5372-c19a-00000000000c 19110 1726882546.82418: WORKER PROCESS EXITING ok: [managed_node1] => { "network_provider": "nm" } 19110 1726882546.82495: no more pending results, returning what we have 19110 1726882546.82497: results queue empty 19110 1726882546.82498: checking for any_errors_fatal 19110 1726882546.82502: done checking for any_errors_fatal 19110 1726882546.82503: checking for max_fail_percentage 19110 1726882546.82504: done checking for max_fail_percentage 19110 1726882546.82505: checking to see if all hosts have failed and the running result is not ok 19110 1726882546.82505: done checking to see if all hosts have failed 19110 1726882546.82506: getting the remaining hosts for this loop 19110 1726882546.82507: done getting the remaining hosts for this loop 19110 1726882546.82510: getting the next task for host managed_node1 19110 1726882546.82515: done getting next task for host managed_node1 19110 1726882546.82517: ^ task is: TASK: meta (flush_handlers) 19110 1726882546.82518: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19110 1726882546.82520: getting variables 19110 1726882546.82521: in VariableManager get_vars() 19110 1726882546.82543: Calling all_inventory to load vars for managed_node1 19110 1726882546.82544: Calling groups_inventory to load vars for managed_node1 19110 1726882546.82546: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882546.82552: Calling all_plugins_play to load vars for managed_node1 19110 1726882546.82554: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882546.82556: Calling groups_plugins_play to load vars for managed_node1 19110 1726882546.82657: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882546.82774: done with get_vars() 19110 1726882546.82780: done getting variables 19110 1726882546.82821: in VariableManager get_vars() 19110 1726882546.82827: Calling all_inventory to load vars for managed_node1 19110 1726882546.82828: Calling groups_inventory to load vars for managed_node1 19110 1726882546.82829: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882546.82832: Calling all_plugins_play to load vars for managed_node1 19110 1726882546.82834: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882546.82835: Calling groups_plugins_play to load vars for managed_node1 19110 1726882546.82932: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882546.83041: done with get_vars() 19110 1726882546.83049: done queuing things up, now waiting for results queue to drain 19110 1726882546.83050: results queue empty 19110 1726882546.83051: checking for any_errors_fatal 19110 1726882546.83052: done checking for any_errors_fatal 19110 1726882546.83052: checking for max_fail_percentage 19110 1726882546.83053: done checking for max_fail_percentage 19110 1726882546.83054: checking to see if all hosts have failed and the running result is not 
ok 19110 1726882546.83054: done checking to see if all hosts have failed 19110 1726882546.83055: getting the remaining hosts for this loop 19110 1726882546.83056: done getting the remaining hosts for this loop 19110 1726882546.83057: getting the next task for host managed_node1 19110 1726882546.83065: done getting next task for host managed_node1 19110 1726882546.83066: ^ task is: TASK: meta (flush_handlers) 19110 1726882546.83067: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19110 1726882546.83069: getting variables 19110 1726882546.83069: in VariableManager get_vars() 19110 1726882546.83074: Calling all_inventory to load vars for managed_node1 19110 1726882546.83076: Calling groups_inventory to load vars for managed_node1 19110 1726882546.83078: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882546.83082: Calling all_plugins_play to load vars for managed_node1 19110 1726882546.83083: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882546.83085: Calling groups_plugins_play to load vars for managed_node1 19110 1726882546.83159: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882546.83263: done with get_vars() 19110 1726882546.83269: done getting variables 19110 1726882546.83298: in VariableManager get_vars() 19110 1726882546.83304: Calling all_inventory to load vars for managed_node1 19110 1726882546.83305: Calling groups_inventory to load vars for managed_node1 19110 1726882546.83307: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882546.83309: Calling all_plugins_play to load vars for managed_node1 19110 1726882546.83311: Calling groups_plugins_inventory to load vars for 
managed_node1 19110 1726882546.83312: Calling groups_plugins_play to load vars for managed_node1 19110 1726882546.83389: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882546.83508: done with get_vars() 19110 1726882546.83517: done queuing things up, now waiting for results queue to drain 19110 1726882546.83518: results queue empty 19110 1726882546.83519: checking for any_errors_fatal 19110 1726882546.83520: done checking for any_errors_fatal 19110 1726882546.83520: checking for max_fail_percentage 19110 1726882546.83521: done checking for max_fail_percentage 19110 1726882546.83521: checking to see if all hosts have failed and the running result is not ok 19110 1726882546.83521: done checking to see if all hosts have failed 19110 1726882546.83522: getting the remaining hosts for this loop 19110 1726882546.83522: done getting the remaining hosts for this loop 19110 1726882546.83524: getting the next task for host managed_node1 19110 1726882546.83525: done getting next task for host managed_node1 19110 1726882546.83526: ^ task is: None 19110 1726882546.83527: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19110 1726882546.83528: done queuing things up, now waiting for results queue to drain 19110 1726882546.83528: results queue empty 19110 1726882546.83528: checking for any_errors_fatal 19110 1726882546.83529: done checking for any_errors_fatal 19110 1726882546.83529: checking for max_fail_percentage 19110 1726882546.83530: done checking for max_fail_percentage 19110 1726882546.83530: checking to see if all hosts have failed and the running result is not ok 19110 1726882546.83531: done checking to see if all hosts have failed 19110 1726882546.83532: getting the next task for host managed_node1 19110 1726882546.83533: done getting next task for host managed_node1 19110 1726882546.83533: ^ task is: None 19110 1726882546.83534: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19110 1726882546.83565: in VariableManager get_vars() 19110 1726882546.83575: done with get_vars() 19110 1726882546.83578: in VariableManager get_vars() 19110 1726882546.83584: done with get_vars() 19110 1726882546.83586: variable 'omit' from source: magic vars 19110 1726882546.83604: in VariableManager get_vars() 19110 1726882546.83610: done with get_vars() 19110 1726882546.83623: variable 'omit' from source: magic vars PLAY [Test configuring ethernet devices] *************************************** 19110 1726882546.83837: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 19110 1726882546.83869: getting the remaining hosts for this loop 19110 1726882546.83871: done getting the remaining hosts for this loop 19110 1726882546.83873: getting the next task for host managed_node1 19110 1726882546.83876: done getting next task for host managed_node1 19110 1726882546.83877: ^ task is: TASK: Gathering Facts 19110 1726882546.83879: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19110 1726882546.83880: getting variables 19110 1726882546.83881: in VariableManager get_vars() 19110 1726882546.83888: Calling all_inventory to load vars for managed_node1 19110 1726882546.83890: Calling groups_inventory to load vars for managed_node1 19110 1726882546.83892: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882546.83895: Calling all_plugins_play to load vars for managed_node1 19110 1726882546.83897: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882546.83899: Calling groups_plugins_play to load vars for managed_node1 19110 1726882546.84008: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882546.84184: done with get_vars() 19110 1726882546.84191: done getting variables 19110 1726882546.84221: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:13 Friday 20 September 2024 21:35:46 -0400 (0:00:00.037) 0:00:03.699 ****** 19110 1726882546.84240: entering _queue_task() for managed_node1/gather_facts 19110 1726882546.84444: worker is 1 (out of 1 available) 19110 1726882546.84457: exiting _queue_task() for managed_node1/gather_facts 19110 1726882546.84471: done queuing things up, now waiting for results queue to drain 19110 1726882546.84472: waiting for pending results... 
19110 1726882546.84708: running TaskExecutor() for managed_node1/TASK: Gathering Facts 19110 1726882546.84794: in run() - task 0e448fcc-3ce9-5372-c19a-0000000000f0 19110 1726882546.84817: variable 'ansible_search_path' from source: unknown 19110 1726882546.84853: calling self._execute() 19110 1726882546.84926: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882546.84937: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882546.84950: variable 'omit' from source: magic vars 19110 1726882546.85291: variable 'ansible_distribution_major_version' from source: facts 19110 1726882546.85308: Evaluated conditional (ansible_distribution_major_version != '6'): True 19110 1726882546.85318: variable 'omit' from source: magic vars 19110 1726882546.85349: variable 'omit' from source: magic vars 19110 1726882546.85389: variable 'omit' from source: magic vars 19110 1726882546.85431: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19110 1726882546.85479: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19110 1726882546.85506: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19110 1726882546.85530: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19110 1726882546.85546: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19110 1726882546.85583: variable 'inventory_hostname' from source: host vars for 'managed_node1' 19110 1726882546.85591: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882546.85598: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882546.85696: Set connection var ansible_timeout to 10 19110 1726882546.85713: Set connection 
var ansible_module_compression to ZIP_DEFLATED 19110 1726882546.85722: Set connection var ansible_shell_executable to /bin/sh 19110 1726882546.85728: Set connection var ansible_shell_type to sh 19110 1726882546.85733: Set connection var ansible_connection to ssh 19110 1726882546.85741: Set connection var ansible_pipelining to False 19110 1726882546.85766: variable 'ansible_shell_executable' from source: unknown 19110 1726882546.85776: variable 'ansible_connection' from source: unknown 19110 1726882546.85784: variable 'ansible_module_compression' from source: unknown 19110 1726882546.85790: variable 'ansible_shell_type' from source: unknown 19110 1726882546.85796: variable 'ansible_shell_executable' from source: unknown 19110 1726882546.85802: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882546.85809: variable 'ansible_pipelining' from source: unknown 19110 1726882546.85815: variable 'ansible_timeout' from source: unknown 19110 1726882546.85822: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882546.86004: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 19110 1726882546.86019: variable 'omit' from source: magic vars 19110 1726882546.86028: starting attempt loop 19110 1726882546.86034: running the handler 19110 1726882546.86052: variable 'ansible_facts' from source: unknown 19110 1726882546.86076: _low_level_execute_command(): starting 19110 1726882546.86087: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 19110 1726882546.86862: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882546.86882: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 
1726882546.86898: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882546.86916: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882546.86957: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882546.86972: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882546.86990: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882546.87008: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882546.87020: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882546.87030: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882546.87042: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882546.87056: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882546.87075: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882546.87091: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882546.87102: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882546.87115: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882546.87185: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882546.87213: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882546.87231: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882546.87356: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 
1726882546.88934: stdout chunk (state=3): >>>/root <<< 19110 1726882546.89034: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882546.89126: stderr chunk (state=3): >>><<< 19110 1726882546.89138: stdout chunk (state=3): >>><<< 19110 1726882546.89267: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 1726882546.89271: _low_level_execute_command(): starting 19110 1726882546.89274: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882546.8917003-19269-31195244832801 `" && echo ansible-tmp-1726882546.8917003-19269-31195244832801="` echo /root/.ansible/tmp/ansible-tmp-1726882546.8917003-19269-31195244832801 `" ) && sleep 0' 19110 1726882546.89894: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 
3.2.2 4 Jun 2024 <<< 19110 1726882546.89909: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882546.89929: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882546.89945: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882546.89989: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882546.90000: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882546.90013: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882546.90038: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882546.90050: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882546.90061: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882546.90076: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882546.90088: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882546.90102: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882546.90113: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882546.90123: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882546.90142: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882546.90219: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882546.90244: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882546.90260: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 
<<< 19110 1726882546.90383: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882546.92226: stdout chunk (state=3): >>>ansible-tmp-1726882546.8917003-19269-31195244832801=/root/.ansible/tmp/ansible-tmp-1726882546.8917003-19269-31195244832801 <<< 19110 1726882546.92385: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882546.92425: stderr chunk (state=3): >>><<< 19110 1726882546.92428: stdout chunk (state=3): >>><<< 19110 1726882546.92570: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882546.8917003-19269-31195244832801=/root/.ansible/tmp/ansible-tmp-1726882546.8917003-19269-31195244832801 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 1726882546.92574: variable 'ansible_module_compression' from source: unknown 19110 1726882546.92576: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-191108pnkimox/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 19110 1726882546.92693: variable 'ansible_facts' from source: unknown 19110 1726882546.92772: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882546.8917003-19269-31195244832801/AnsiballZ_setup.py 19110 1726882546.93042: Sending initial data 19110 1726882546.93045: Sent initial data (153 bytes) 19110 1726882546.94048: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882546.94061: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882546.94080: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882546.94097: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882546.94144: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882546.94157: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882546.94174: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882546.94190: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882546.94201: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882546.94217: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882546.94229: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882546.94243: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882546.94259: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882546.94274: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 
1726882546.94284: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882546.94296: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882546.94379: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882546.94400: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882546.94415: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882546.94536: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882546.96267: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 19110 1726882546.96356: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 19110 1726882546.96445: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-191108pnkimox/tmpeqy0j8f8 /root/.ansible/tmp/ansible-tmp-1726882546.8917003-19269-31195244832801/AnsiballZ_setup.py <<< 19110 1726882546.96536: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 19110 1726882546.99210: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882546.99369: stderr chunk (state=3): >>><<< 19110 1726882546.99372: stdout chunk (state=3): >>><<< 19110 1726882546.99374: done transferring module 
to remote 19110 1726882546.99376: _low_level_execute_command(): starting 19110 1726882546.99382: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882546.8917003-19269-31195244832801/ /root/.ansible/tmp/ansible-tmp-1726882546.8917003-19269-31195244832801/AnsiballZ_setup.py && sleep 0' 19110 1726882547.00000: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882547.00012: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882547.00026: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882547.00047: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882547.00091: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882547.00102: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882547.00115: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882547.00131: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882547.00142: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882547.00157: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882547.00171: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882547.00184: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882547.00198: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882547.00209: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882547.00219: stderr chunk (state=3): >>>debug2: match found <<< 19110 
1726882547.00230: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882547.00312: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882547.00334: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882547.00350: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882547.00471: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882547.02268: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882547.02278: stdout chunk (state=3): >>><<< 19110 1726882547.02285: stderr chunk (state=3): >>><<< 19110 1726882547.02301: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 1726882547.02304: 
_low_level_execute_command(): starting 19110 1726882547.02309: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882546.8917003-19269-31195244832801/AnsiballZ_setup.py && sleep 0' 19110 1726882547.03073: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882547.03077: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882547.03117: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found <<< 19110 1726882547.03120: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882547.03122: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found <<< 19110 1726882547.03124: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882547.03190: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882547.03194: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882547.03307: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882547.53894: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": 
"UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_env": {"SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.40.7 34614 10.31.44.90 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", 
"XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.40.7 34614 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBANd7RrvRqB/kTBmp4g3vOXSd1XQX7zSKmnuTmpsZ60vmB0PwXJIj7HTI9DBqGrzofXOOHlh/Qs4ml+D2H79EO+6EUEZza9meZ+anJVVwXBuxgfn1Hl/EExsmp7gV5o+r0rCFeuFds6ISUZQpal2OlQk3lrit00RA//JoxvQO1YQdAAAAFQDxLJGSeERY5sZYyGr88om1BNq9KQAAAIEA1WO5ElHMof6Upf9GQZn4wlJh6JUOjZfYnL4XATh/W6uye7kuC/rBqGPirkmks1GCUkKhSkzQlRIPyLYENrbPKRMNarYAwwQ8N8BDOUWFDCcrO55SJdlLTyYGWdlVFysYGSMIyZT5ye4oL3Nkff/e1ZGqjvL0sLNJaye4za70Xj4AAACAXRnpJPDKgCzakgHO14HsH3r7qS4cvfQld+ThBJZcbtKtJUyUo1sqg9NbvK+hvA41TYWOVy52DcWn5yzPwfhAn+mQEcAdBXWqSSJdESZ9fPnbc9C1XO02sYpyX5+wsZnau23XhtnlnY8jRTpWgRt4X8AWaSND9mfeL8ZDTgVphRc=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCi3knQjBMOMykE1WC1CWkOsV5q0hmL0iOCSTq050rqbxjUmkXoe4BeOWZh4ID8k5GgD5emRh5PU5ME25rsP6hnelUBzDOCjdgI1rmTkUbw5oFRN+kpX2RbAM/2L5J6RrqqllFBjrHtSG6JINsPIWqLn9rlmE965Kj4AY3QplgiApeN07yDM5YPS2tpRpVHVIxZia7pdzKs+h+TXJoo/Z7SMAw8MTUNeIXd9bSzuxhulOrLNrvrcej4EVg88FYiy96oA+NBNLFb41RHNLumM3xUQvjCeyqP1hcUw9YAY+gwADhRGBKQ+JCAzzLqyM/3RAO8gXXJr1Yjr+H9xi8IwKB71QU/lw7bWc33YuNbe5mDlUHQ/a2qvo4O96wD8m4eZpu81iHiwtIU5cwKm+fk8sz9kxOR77AozaYLtjgW9FYUmxh2ChfTBV2rnFflhC3CjgRMlZv8CLtne5JcsRFSpHeCB2RXzA1JPiF89OxoUFa8NsPqdUyAMUkaR8MmW+fj+t8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": 
"AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAqf0PBdBRTqmYqGYFABEC2soCa/Gsi3A2munUEZEo0enWjwRQivB5pKJDjOn6lwgGyr2ebU0/VpRzddk73uEfk=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOdij61fBkKDElS700Z560nYW2c4QCIx/VplUW7jn+UE", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system": "Linux", "ansible_kernel": "5.14.0-508.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 15:49:37 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "ip-10-31-44-90.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-44-90", "ansible_nodename": "ip-10-31-44-90.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "465293f2bd9b457497a5eaf565f184f8", "ansible_apparmor": {"status": "disabled"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_loadavg": {"1m": 0.5, "5m": 0.41, "15m": 0.22}, "ansible_fips": false, "ansible_fibre_channel_wwn": [], "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2805, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 727, "free": 2805}, "nocache": {"free": 3267, "used": 265}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", 
"ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_<<< 19110 1726882547.53907: stdout chunk (state=3): >>>chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2ca037-5a7b-1ebc-28c4-7fb32adce23a", "ansible_product_uuid": "ec2ca037-5a7b-1ebc-28c4-7fb32adce23a", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 705, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264239337472, "block_size": 4096, "block_total": 65519355, "block_available": 64511557, "block_used": 1007798, "inode_total": 131071472, "inode_available": 130998699, "inode_used": 72773, "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013"}], "ansible_lsb": {}, "ansible_service_mgr": "systemd", "ansible_local": {}, "ansible_is_chroot": false, "ansible_iscsi_iqn": "", 
"ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:ef4e1c39-6f50-438a-87e7-12fb70b80bde", "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "02:9e:a1:0b:f9:6d", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.44.90", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::9e:a1ff:fe0b:f96d", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": 
"off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", 
"netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.44.90", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:9e:a1:0b:f9:6d", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.44.90"], "ansible_all_ipv6_addresses": ["fe80::9e:a1ff:fe0b:f96d"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.44.90", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::9e:a1ff:fe0b:f96d"]}, "ansible_selinux_python_present": true, 
"ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "35", "second": "47", "epoch": "1726882547", "epoch_int": "1726882547", "date": "2024-09-20", "time": "21:35:47", "iso8601_micro": "2024-09-21T01:35:47.535103Z", "iso8601": "2024-09-21T01:35:47Z", "iso8601_basic": "20240920T213547535103", "iso8601_basic_short": "20240920T213547", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 19110 1726882547.55573: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. <<< 19110 1726882547.55577: stdout chunk (state=3): >>><<< 19110 1726882547.55592: stderr chunk (state=3): >>><<< 19110 1726882547.55776: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 
0, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_env": {"SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.40.7 34614 10.31.44.90 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.40.7 34614 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_ssh_host_key_dsa_public": 
"AAAAB3NzaC1kc3MAAACBANd7RrvRqB/kTBmp4g3vOXSd1XQX7zSKmnuTmpsZ60vmB0PwXJIj7HTI9DBqGrzofXOOHlh/Qs4ml+D2H79EO+6EUEZza9meZ+anJVVwXBuxgfn1Hl/EExsmp7gV5o+r0rCFeuFds6ISUZQpal2OlQk3lrit00RA//JoxvQO1YQdAAAAFQDxLJGSeERY5sZYyGr88om1BNq9KQAAAIEA1WO5ElHMof6Upf9GQZn4wlJh6JUOjZfYnL4XATh/W6uye7kuC/rBqGPirkmks1GCUkKhSkzQlRIPyLYENrbPKRMNarYAwwQ8N8BDOUWFDCcrO55SJdlLTyYGWdlVFysYGSMIyZT5ye4oL3Nkff/e1ZGqjvL0sLNJaye4za70Xj4AAACAXRnpJPDKgCzakgHO14HsH3r7qS4cvfQld+ThBJZcbtKtJUyUo1sqg9NbvK+hvA41TYWOVy52DcWn5yzPwfhAn+mQEcAdBXWqSSJdESZ9fPnbc9C1XO02sYpyX5+wsZnau23XhtnlnY8jRTpWgRt4X8AWaSND9mfeL8ZDTgVphRc=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCi3knQjBMOMykE1WC1CWkOsV5q0hmL0iOCSTq050rqbxjUmkXoe4BeOWZh4ID8k5GgD5emRh5PU5ME25rsP6hnelUBzDOCjdgI1rmTkUbw5oFRN+kpX2RbAM/2L5J6RrqqllFBjrHtSG6JINsPIWqLn9rlmE965Kj4AY3QplgiApeN07yDM5YPS2tpRpVHVIxZia7pdzKs+h+TXJoo/Z7SMAw8MTUNeIXd9bSzuxhulOrLNrvrcej4EVg88FYiy96oA+NBNLFb41RHNLumM3xUQvjCeyqP1hcUw9YAY+gwADhRGBKQ+JCAzzLqyM/3RAO8gXXJr1Yjr+H9xi8IwKB71QU/lw7bWc33YuNbe5mDlUHQ/a2qvo4O96wD8m4eZpu81iHiwtIU5cwKm+fk8sz9kxOR77AozaYLtjgW9FYUmxh2ChfTBV2rnFflhC3CjgRMlZv8CLtne5JcsRFSpHeCB2RXzA1JPiF89OxoUFa8NsPqdUyAMUkaR8MmW+fj+t8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAqf0PBdBRTqmYqGYFABEC2soCa/Gsi3A2munUEZEo0enWjwRQivB5pKJDjOn6lwgGyr2ebU0/VpRzddk73uEfk=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOdij61fBkKDElS700Z560nYW2c4QCIx/VplUW7jn+UE", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system": "Linux", "ansible_kernel": "5.14.0-508.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 15:49:37 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "ip-10-31-44-90.us-east-1.aws.redhat.com", "ansible_hostname": 
"ip-10-31-44-90", "ansible_nodename": "ip-10-31-44-90.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "465293f2bd9b457497a5eaf565f184f8", "ansible_apparmor": {"status": "disabled"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_loadavg": {"1m": 0.5, "5m": 0.41, "15m": 0.22}, "ansible_fips": false, "ansible_fibre_channel_wwn": [], "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2805, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 727, "free": 2805}, "nocache": {"free": 3267, "used": 265}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2ca037-5a7b-1ebc-28c4-7fb32adce23a", "ansible_product_uuid": "ec2ca037-5a7b-1ebc-28c4-7fb32adce23a", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, 
"sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 705, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264239337472, "block_size": 4096, "block_total": 65519355, "block_available": 64511557, "block_used": 1007798, "inode_total": 131071472, "inode_available": 130998699, "inode_used": 72773, "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013"}], "ansible_lsb": {}, "ansible_service_mgr": "systemd", "ansible_local": {}, "ansible_is_chroot": false, "ansible_iscsi_iqn": "", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:ef4e1c39-6f50-438a-87e7-12fb70b80bde", "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "02:9e:a1:0b:f9:6d", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.44.90", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::9e:a1ff:fe0b:f96d", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", 
"tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", 
"hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": 
"off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.44.90", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:9e:a1:0b:f9:6d", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.44.90"], "ansible_all_ipv6_addresses": ["fe80::9e:a1ff:fe0b:f96d"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.44.90", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::9e:a1ff:fe0b:f96d"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "35", "second": "47", "epoch": "1726882547", "epoch_int": "1726882547", "date": "2024-09-20", "time": "21:35:47", "iso8601_micro": "2024-09-21T01:35:47.535103Z", "iso8601": "2024-09-21T01:35:47Z", "iso8601_basic": "20240920T213547535103", "iso8601_basic_short": "20240920T213547", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": 
{"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
19110 1726882547.55994: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882546.8917003-19269-31195244832801/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 19110 1726882547.56019: _low_level_execute_command(): starting 19110 1726882547.56028: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882546.8917003-19269-31195244832801/ > /dev/null 2>&1 && sleep 0' 19110 1726882547.56648: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882547.56668: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882547.56685: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882547.56704: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882547.56746: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882547.56769: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882547.56784: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882547.56802: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882547.56814: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 
10.31.44.90 is address <<< 19110 1726882547.56826: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882547.56837: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882547.56851: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882547.56872: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882547.56887: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882547.56898: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882547.56911: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882547.57011: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882547.57034: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882547.57051: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882547.57180: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882547.59066: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882547.59089: stdout chunk (state=3): >>><<< 19110 1726882547.59092: stderr chunk (state=3): >>><<< 19110 1726882547.59239: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 
is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 1726882547.59243: handler run complete 19110 1726882547.59347: variable 'ansible_facts' from source: unknown 19110 1726882547.59367: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882547.59692: variable 'ansible_facts' from source: unknown 19110 1726882547.59784: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882547.59918: attempt loop complete, returning result 19110 1726882547.59928: _execute() done 19110 1726882547.59935: dumping result to json 19110 1726882547.59970: done dumping result, returning 19110 1726882547.59984: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [0e448fcc-3ce9-5372-c19a-0000000000f0] 19110 1726882547.59999: sending task result for task 0e448fcc-3ce9-5372-c19a-0000000000f0 ok: [managed_node1] 19110 1726882547.60616: no more pending results, returning what we have 19110 1726882547.60619: results queue empty 19110 1726882547.60620: checking for any_errors_fatal 19110 1726882547.60622: done checking for any_errors_fatal 19110 1726882547.60623: checking for max_fail_percentage 19110 1726882547.60624: done checking for max_fail_percentage 19110 1726882547.60625: checking to see if all hosts have failed and the running result is not ok 19110 1726882547.60626: done 
checking to see if all hosts have failed 19110 1726882547.60626: getting the remaining hosts for this loop 19110 1726882547.60628: done getting the remaining hosts for this loop 19110 1726882547.60632: getting the next task for host managed_node1 19110 1726882547.60638: done getting next task for host managed_node1 19110 1726882547.60641: ^ task is: TASK: meta (flush_handlers) 19110 1726882547.60643: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19110 1726882547.60647: getting variables 19110 1726882547.60649: in VariableManager get_vars() 19110 1726882547.60677: Calling all_inventory to load vars for managed_node1 19110 1726882547.60680: Calling groups_inventory to load vars for managed_node1 19110 1726882547.60684: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882547.60695: Calling all_plugins_play to load vars for managed_node1 19110 1726882547.60698: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882547.60701: Calling groups_plugins_play to load vars for managed_node1 19110 1726882547.60857: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882547.61278: done with get_vars() 19110 1726882547.61288: done getting variables 19110 1726882547.61349: done sending task result for task 0e448fcc-3ce9-5372-c19a-0000000000f0 19110 1726882547.61351: WORKER PROCESS EXITING 19110 1726882547.61397: in VariableManager get_vars() 19110 1726882547.61405: Calling all_inventory to load vars for managed_node1 19110 1726882547.61407: Calling groups_inventory to load vars for managed_node1 19110 1726882547.61410: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882547.61414: Calling 
all_plugins_play to load vars for managed_node1 19110 1726882547.61416: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882547.61422: Calling groups_plugins_play to load vars for managed_node1 19110 1726882547.61711: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882547.61899: done with get_vars() 19110 1726882547.61913: done queuing things up, now waiting for results queue to drain 19110 1726882547.61915: results queue empty 19110 1726882547.61915: checking for any_errors_fatal 19110 1726882547.61918: done checking for any_errors_fatal 19110 1726882547.61919: checking for max_fail_percentage 19110 1726882547.61920: done checking for max_fail_percentage 19110 1726882547.61921: checking to see if all hosts have failed and the running result is not ok 19110 1726882547.61922: done checking to see if all hosts have failed 19110 1726882547.61922: getting the remaining hosts for this loop 19110 1726882547.61923: done getting the remaining hosts for this loop 19110 1726882547.61934: getting the next task for host managed_node1 19110 1726882547.61938: done getting next task for host managed_node1 19110 1726882547.61940: ^ task is: TASK: Set type={{ type }} and interface={{ interface }} 19110 1726882547.61945: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19110 1726882547.61948: getting variables 19110 1726882547.61949: in VariableManager get_vars() 19110 1726882547.61957: Calling all_inventory to load vars for managed_node1 19110 1726882547.61967: Calling groups_inventory to load vars for managed_node1 19110 1726882547.61970: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882547.61974: Calling all_plugins_play to load vars for managed_node1 19110 1726882547.61977: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882547.61980: Calling groups_plugins_play to load vars for managed_node1 19110 1726882547.62132: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882547.62348: done with get_vars() 19110 1726882547.62361: done getting variables 19110 1726882547.62400: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 19110 1726882547.62544: variable 'type' from source: play vars 19110 1726882547.62550: variable 'interface' from source: play vars TASK [Set type=veth and interface=lsr27] *************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:20 Friday 20 September 2024 21:35:47 -0400 (0:00:00.783) 0:00:04.483 ****** 19110 1726882547.62599: entering _queue_task() for managed_node1/set_fact 19110 1726882547.62880: worker is 1 (out of 1 available) 19110 1726882547.62891: exiting _queue_task() for managed_node1/set_fact 19110 1726882547.62906: done queuing things up, now waiting for results queue to drain 19110 1726882547.62908: waiting for pending results... 
19110 1726882547.63172: running TaskExecutor() for managed_node1/TASK: Set type=veth and interface=lsr27 19110 1726882547.63274: in run() - task 0e448fcc-3ce9-5372-c19a-00000000000f 19110 1726882547.63293: variable 'ansible_search_path' from source: unknown 19110 1726882547.63331: calling self._execute() 19110 1726882547.63497: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882547.63504: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882547.63523: variable 'omit' from source: magic vars 19110 1726882547.63815: variable 'ansible_distribution_major_version' from source: facts 19110 1726882547.63835: Evaluated conditional (ansible_distribution_major_version != '6'): True 19110 1726882547.63845: variable 'omit' from source: magic vars 19110 1726882547.63878: variable 'omit' from source: magic vars 19110 1726882547.63908: variable 'type' from source: play vars 19110 1726882547.63982: variable 'type' from source: play vars 19110 1726882547.63995: variable 'interface' from source: play vars 19110 1726882547.64058: variable 'interface' from source: play vars 19110 1726882547.64082: variable 'omit' from source: magic vars 19110 1726882547.64110: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19110 1726882547.64147: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19110 1726882547.64176: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19110 1726882547.64200: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19110 1726882547.64217: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19110 1726882547.64252: variable 'inventory_hostname' from source: host vars for 'managed_node1' 19110 
1726882547.64262: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882547.64275: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882547.64371: Set connection var ansible_timeout to 10 19110 1726882547.64390: Set connection var ansible_module_compression to ZIP_DEFLATED 19110 1726882547.64401: Set connection var ansible_shell_executable to /bin/sh 19110 1726882547.64408: Set connection var ansible_shell_type to sh 19110 1726882547.64413: Set connection var ansible_connection to ssh 19110 1726882547.64422: Set connection var ansible_pipelining to False 19110 1726882547.64445: variable 'ansible_shell_executable' from source: unknown 19110 1726882547.64452: variable 'ansible_connection' from source: unknown 19110 1726882547.64458: variable 'ansible_module_compression' from source: unknown 19110 1726882547.64465: variable 'ansible_shell_type' from source: unknown 19110 1726882547.64473: variable 'ansible_shell_executable' from source: unknown 19110 1726882547.64480: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882547.64487: variable 'ansible_pipelining' from source: unknown 19110 1726882547.64494: variable 'ansible_timeout' from source: unknown 19110 1726882547.64502: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882547.64635: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 19110 1726882547.64649: variable 'omit' from source: magic vars 19110 1726882547.64660: starting attempt loop 19110 1726882547.64670: running the handler 19110 1726882547.64687: handler run complete 19110 1726882547.64703: attempt loop complete, returning result 19110 1726882547.64710: _execute() done 19110 
1726882547.64717: dumping result to json 19110 1726882547.64725: done dumping result, returning 19110 1726882547.64735: done running TaskExecutor() for managed_node1/TASK: Set type=veth and interface=lsr27 [0e448fcc-3ce9-5372-c19a-00000000000f] 19110 1726882547.64746: sending task result for task 0e448fcc-3ce9-5372-c19a-00000000000f ok: [managed_node1] => { "ansible_facts": { "interface": "lsr27", "type": "veth" }, "changed": false } 19110 1726882547.64952: no more pending results, returning what we have 19110 1726882547.64957: results queue empty 19110 1726882547.64958: checking for any_errors_fatal 19110 1726882547.64959: done checking for any_errors_fatal 19110 1726882547.64959: checking for max_fail_percentage 19110 1726882547.64961: done checking for max_fail_percentage 19110 1726882547.64961: checking to see if all hosts have failed and the running result is not ok 19110 1726882547.64962: done checking to see if all hosts have failed 19110 1726882547.64962: getting the remaining hosts for this loop 19110 1726882547.64965: done getting the remaining hosts for this loop 19110 1726882547.64969: getting the next task for host managed_node1 19110 1726882547.64974: done getting next task for host managed_node1 19110 1726882547.64976: ^ task is: TASK: Include the task 'show_interfaces.yml' 19110 1726882547.64978: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19110 1726882547.64981: getting variables 19110 1726882547.64983: in VariableManager get_vars() 19110 1726882547.65005: Calling all_inventory to load vars for managed_node1 19110 1726882547.65007: Calling groups_inventory to load vars for managed_node1 19110 1726882547.65010: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882547.65021: Calling all_plugins_play to load vars for managed_node1 19110 1726882547.65023: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882547.65027: Calling groups_plugins_play to load vars for managed_node1 19110 1726882547.65198: done sending task result for task 0e448fcc-3ce9-5372-c19a-00000000000f 19110 1726882547.65201: WORKER PROCESS EXITING 19110 1726882547.65223: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882547.65381: done with get_vars() 19110 1726882547.65387: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:24 Friday 20 September 2024 21:35:47 -0400 (0:00:00.028) 0:00:04.511 ****** 19110 1726882547.65473: entering _queue_task() for managed_node1/include_tasks 19110 1726882547.65695: worker is 1 (out of 1 available) 19110 1726882547.65706: exiting _queue_task() for managed_node1/include_tasks 19110 1726882547.65722: done queuing things up, now waiting for results queue to drain 19110 1726882547.65727: waiting for pending results... 
19110 1726882547.65923: running TaskExecutor() for managed_node1/TASK: Include the task 'show_interfaces.yml' 19110 1726882547.66024: in run() - task 0e448fcc-3ce9-5372-c19a-000000000010 19110 1726882547.66047: variable 'ansible_search_path' from source: unknown 19110 1726882547.66092: calling self._execute() 19110 1726882547.66179: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882547.66189: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882547.66201: variable 'omit' from source: magic vars 19110 1726882547.66595: variable 'ansible_distribution_major_version' from source: facts 19110 1726882547.66618: Evaluated conditional (ansible_distribution_major_version != '6'): True 19110 1726882547.66629: _execute() done 19110 1726882547.66636: dumping result to json 19110 1726882547.66647: done dumping result, returning 19110 1726882547.66658: done running TaskExecutor() for managed_node1/TASK: Include the task 'show_interfaces.yml' [0e448fcc-3ce9-5372-c19a-000000000010] 19110 1726882547.66672: sending task result for task 0e448fcc-3ce9-5372-c19a-000000000010 19110 1726882547.66778: done sending task result for task 0e448fcc-3ce9-5372-c19a-000000000010 19110 1726882547.66784: WORKER PROCESS EXITING 19110 1726882547.66817: no more pending results, returning what we have 19110 1726882547.66823: in VariableManager get_vars() 19110 1726882547.66858: Calling all_inventory to load vars for managed_node1 19110 1726882547.66865: Calling groups_inventory to load vars for managed_node1 19110 1726882547.66869: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882547.66882: Calling all_plugins_play to load vars for managed_node1 19110 1726882547.66885: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882547.66887: Calling groups_plugins_play to load vars for managed_node1 19110 1726882547.67072: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882547.67968: done with get_vars() 19110 1726882547.67975: variable 'ansible_search_path' from source: unknown 19110 1726882547.67987: we have included files to process 19110 1726882547.67988: generating all_blocks data 19110 1726882547.67989: done generating all_blocks data 19110 1726882547.67994: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 19110 1726882547.67995: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 19110 1726882547.67997: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 19110 1726882547.68192: in VariableManager get_vars() 19110 1726882547.68214: done with get_vars() 19110 1726882547.68322: done processing included file 19110 1726882547.68324: iterating over new_blocks loaded from include file 19110 1726882547.68325: in VariableManager get_vars() 19110 1726882547.68336: done with get_vars() 19110 1726882547.68337: filtering new block on tags 19110 1726882547.68354: done filtering new block on tags 19110 1726882547.68356: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed_node1 19110 1726882547.68360: extending task lists for all hosts with included blocks 19110 1726882547.68455: done extending task lists 19110 1726882547.68456: done processing included files 19110 1726882547.68457: results queue empty 19110 1726882547.68458: checking for any_errors_fatal 19110 1726882547.68466: done checking for any_errors_fatal 19110 1726882547.68467: checking for max_fail_percentage 19110 1726882547.68468: done checking for 
max_fail_percentage 19110 1726882547.68468: checking to see if all hosts have failed and the running result is not ok 19110 1726882547.68469: done checking to see if all hosts have failed 19110 1726882547.68470: getting the remaining hosts for this loop 19110 1726882547.68471: done getting the remaining hosts for this loop 19110 1726882547.68474: getting the next task for host managed_node1 19110 1726882547.68477: done getting next task for host managed_node1 19110 1726882547.68479: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 19110 1726882547.68482: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19110 1726882547.68484: getting variables 19110 1726882547.68485: in VariableManager get_vars() 19110 1726882547.68493: Calling all_inventory to load vars for managed_node1 19110 1726882547.68495: Calling groups_inventory to load vars for managed_node1 19110 1726882547.68497: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882547.68502: Calling all_plugins_play to load vars for managed_node1 19110 1726882547.68504: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882547.68507: Calling groups_plugins_play to load vars for managed_node1 19110 1726882547.68656: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882547.68831: done with get_vars() 19110 1726882547.68839: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Friday 20 September 2024 21:35:47 -0400 (0:00:00.034) 0:00:04.546 ****** 19110 1726882547.68912: entering _queue_task() for managed_node1/include_tasks 19110 1726882547.69134: worker is 1 (out of 1 available) 19110 1726882547.69147: exiting _queue_task() for managed_node1/include_tasks 19110 1726882547.69158: done queuing things up, now waiting for results queue to drain 19110 1726882547.69159: waiting for pending results... 
19110 1726882547.69886: running TaskExecutor() for managed_node1/TASK: Include the task 'get_current_interfaces.yml' 19110 1726882547.70003: in run() - task 0e448fcc-3ce9-5372-c19a-000000000104 19110 1726882547.70023: variable 'ansible_search_path' from source: unknown 19110 1726882547.70030: variable 'ansible_search_path' from source: unknown 19110 1726882547.70081: calling self._execute() 19110 1726882547.70165: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882547.70180: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882547.70195: variable 'omit' from source: magic vars 19110 1726882547.70545: variable 'ansible_distribution_major_version' from source: facts 19110 1726882547.70554: Evaluated conditional (ansible_distribution_major_version != '6'): True 19110 1726882547.70562: _execute() done 19110 1726882547.70568: dumping result to json 19110 1726882547.70570: done dumping result, returning 19110 1726882547.70573: done running TaskExecutor() for managed_node1/TASK: Include the task 'get_current_interfaces.yml' [0e448fcc-3ce9-5372-c19a-000000000104] 19110 1726882547.70580: sending task result for task 0e448fcc-3ce9-5372-c19a-000000000104 19110 1726882547.70657: done sending task result for task 0e448fcc-3ce9-5372-c19a-000000000104 19110 1726882547.70660: WORKER PROCESS EXITING 19110 1726882547.70686: no more pending results, returning what we have 19110 1726882547.70692: in VariableManager get_vars() 19110 1726882547.70723: Calling all_inventory to load vars for managed_node1 19110 1726882547.70726: Calling groups_inventory to load vars for managed_node1 19110 1726882547.70728: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882547.70737: Calling all_plugins_play to load vars for managed_node1 19110 1726882547.70740: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882547.70742: Calling groups_plugins_play to load vars for managed_node1 19110 
1726882547.70882: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882547.70987: done with get_vars() 19110 1726882547.70993: variable 'ansible_search_path' from source: unknown 19110 1726882547.70994: variable 'ansible_search_path' from source: unknown 19110 1726882547.71018: we have included files to process 19110 1726882547.71019: generating all_blocks data 19110 1726882547.71020: done generating all_blocks data 19110 1726882547.71020: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 19110 1726882547.71021: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 19110 1726882547.71022: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 19110 1726882547.71220: done processing included file 19110 1726882547.71221: iterating over new_blocks loaded from include file 19110 1726882547.71222: in VariableManager get_vars() 19110 1726882547.71230: done with get_vars() 19110 1726882547.71231: filtering new block on tags 19110 1726882547.71241: done filtering new block on tags 19110 1726882547.71242: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed_node1 19110 1726882547.71245: extending task lists for all hosts with included blocks 19110 1726882547.71304: done extending task lists 19110 1726882547.71305: done processing included files 19110 1726882547.71305: results queue empty 19110 1726882547.71306: checking for any_errors_fatal 19110 1726882547.71307: done checking for any_errors_fatal 19110 1726882547.71308: checking for max_fail_percentage 19110 1726882547.71308: done 
checking for max_fail_percentage 19110 1726882547.71309: checking to see if all hosts have failed and the running result is not ok 19110 1726882547.71309: done checking to see if all hosts have failed 19110 1726882547.71310: getting the remaining hosts for this loop 19110 1726882547.71311: done getting the remaining hosts for this loop 19110 1726882547.71312: getting the next task for host managed_node1 19110 1726882547.71315: done getting next task for host managed_node1 19110 1726882547.71317: ^ task is: TASK: Gather current interface info 19110 1726882547.71319: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19110 1726882547.71321: getting variables 19110 1726882547.71322: in VariableManager get_vars() 19110 1726882547.71327: Calling all_inventory to load vars for managed_node1 19110 1726882547.71328: Calling groups_inventory to load vars for managed_node1 19110 1726882547.71330: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882547.71333: Calling all_plugins_play to load vars for managed_node1 19110 1726882547.71335: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882547.71336: Calling groups_plugins_play to load vars for managed_node1 19110 1726882547.71415: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882547.71560: done with get_vars() 19110 1726882547.71570: done getting variables 19110 1726882547.71595: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Friday 20 September 2024 21:35:47 -0400 (0:00:00.027) 0:00:04.573 ****** 19110 1726882547.71613: entering _queue_task() for managed_node1/command 19110 1726882547.71819: worker is 1 (out of 1 available) 19110 1726882547.71856: exiting _queue_task() for managed_node1/command 19110 1726882547.71869: done queuing things up, now waiting for results queue to drain 19110 1726882547.71871: waiting for pending results... 
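As a hedged aside (commentary, not part of the captured log): the task banner above runs the `command` module with `ls -1` and `chdir: /sys/class/net`, as the `module_args` later in this trace show. Listing a sysfs class directory that way is equivalent to the following sketch; a temporary directory stands in for `/sys/class/net` so the example is self-contained, and the entry names are taken from the task result recorded further down.

```python
# Hedged sketch: what `ls -1` with chdir=/sys/class/net amounts to.
# A temp directory simulates the sysfs tree; the entry names mirror the
# stdout captured later in this log ("bonding_masters", "eth0", "lo").
import os
import tempfile

with tempfile.TemporaryDirectory() as fake_sysfs:
    for name in ("lo", "eth0", "bonding_masters"):
        # sysfs exposes one entry per interface (plus bonding control files)
        open(os.path.join(fake_sysfs, name), "w").close()

    # `ls -1` prints one entry per line in sorted order;
    # sorted(os.listdir(...)) produces the same listing
    entries = sorted(os.listdir(fake_sysfs))
    print("\n".join(entries))  # bonding_masters, eth0, lo — one per line
```

On a real managed node the task simply captures this listing as `stdout`, which later tasks in the test playbook consume.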
19110 1726882547.72357: running TaskExecutor() for managed_node1/TASK: Gather current interface info 19110 1726882547.72474: in run() - task 0e448fcc-3ce9-5372-c19a-000000000115 19110 1726882547.72494: variable 'ansible_search_path' from source: unknown 19110 1726882547.72507: variable 'ansible_search_path' from source: unknown 19110 1726882547.72544: calling self._execute() 19110 1726882547.72629: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882547.72639: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882547.72651: variable 'omit' from source: magic vars 19110 1726882547.73541: variable 'ansible_distribution_major_version' from source: facts 19110 1726882547.73563: Evaluated conditional (ansible_distribution_major_version != '6'): True 19110 1726882547.73577: variable 'omit' from source: magic vars 19110 1726882547.73622: variable 'omit' from source: magic vars 19110 1726882547.73785: variable 'omit' from source: magic vars 19110 1726882547.73905: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19110 1726882547.73948: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19110 1726882547.74111: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19110 1726882547.74212: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19110 1726882547.74262: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19110 1726882547.74485: variable 'inventory_hostname' from source: host vars for 'managed_node1' 19110 1726882547.74522: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882547.74535: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 
1726882547.74761: Set connection var ansible_timeout to 10 19110 1726882547.74766: Set connection var ansible_module_compression to ZIP_DEFLATED 19110 1726882547.74769: Set connection var ansible_shell_executable to /bin/sh 19110 1726882547.74771: Set connection var ansible_shell_type to sh 19110 1726882547.74773: Set connection var ansible_connection to ssh 19110 1726882547.74775: Set connection var ansible_pipelining to False 19110 1726882547.74780: variable 'ansible_shell_executable' from source: unknown 19110 1726882547.74817: variable 'ansible_connection' from source: unknown 19110 1726882547.74820: variable 'ansible_module_compression' from source: unknown 19110 1726882547.74823: variable 'ansible_shell_type' from source: unknown 19110 1726882547.74825: variable 'ansible_shell_executable' from source: unknown 19110 1726882547.74827: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882547.74830: variable 'ansible_pipelining' from source: unknown 19110 1726882547.74832: variable 'ansible_timeout' from source: unknown 19110 1726882547.74834: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882547.75280: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 19110 1726882547.75289: variable 'omit' from source: magic vars 19110 1726882547.75299: starting attempt loop 19110 1726882547.75302: running the handler 19110 1726882547.75315: _low_level_execute_command(): starting 19110 1726882547.75323: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 19110 1726882547.76035: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882547.76048: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config <<< 19110 1726882547.76068: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882547.76085: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882547.76121: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882547.76129: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882547.76139: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882547.76152: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882547.76166: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882547.76175: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882547.76184: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882547.76199: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882547.76212: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882547.76219: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882547.76225: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882547.76235: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882547.76319: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882547.76333: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882547.76344: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882547.76472: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session 
id: 2 <<< 19110 1726882547.78140: stdout chunk (state=3): >>>/root <<< 19110 1726882547.78276: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882547.78285: stderr chunk (state=3): >>><<< 19110 1726882547.78288: stdout chunk (state=3): >>><<< 19110 1726882547.78304: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 1726882547.78314: _low_level_execute_command(): starting 19110 1726882547.78320: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882547.7830362-19305-275522364635080 `" && echo ansible-tmp-1726882547.7830362-19305-275522364635080="` echo /root/.ansible/tmp/ansible-tmp-1726882547.7830362-19305-275522364635080 `" ) && sleep 0' 19110 1726882547.78737: stderr chunk (state=2): 
>>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882547.78742: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882547.78788: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882547.78869: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882547.78875: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882547.78908: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882547.78996: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882547.80844: stdout chunk (state=3): >>>ansible-tmp-1726882547.7830362-19305-275522364635080=/root/.ansible/tmp/ansible-tmp-1726882547.7830362-19305-275522364635080 <<< 19110 1726882547.80961: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882547.81007: stderr chunk (state=3): >>><<< 19110 1726882547.81009: stdout chunk (state=3): >>><<< 19110 1726882547.81069: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726882547.7830362-19305-275522364635080=/root/.ansible/tmp/ansible-tmp-1726882547.7830362-19305-275522364635080 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 1726882547.81073: variable 'ansible_module_compression' from source: unknown 19110 1726882547.81091: ANSIBALLZ: Using generic lock for ansible.legacy.command 19110 1726882547.81094: ANSIBALLZ: Acquiring lock 19110 1726882547.81096: ANSIBALLZ: Lock acquired: 139855634067296 19110 1726882547.81098: ANSIBALLZ: Creating module 19110 1726882547.89584: ANSIBALLZ: Writing module into payload 19110 1726882547.89656: ANSIBALLZ: Writing module 19110 1726882547.89678: ANSIBALLZ: Renaming module 19110 1726882547.89683: ANSIBALLZ: Done creating module 19110 1726882547.89699: variable 'ansible_facts' from source: unknown 19110 1726882547.89747: transferring module to remote 
/root/.ansible/tmp/ansible-tmp-1726882547.7830362-19305-275522364635080/AnsiballZ_command.py 19110 1726882547.89857: Sending initial data 19110 1726882547.89866: Sent initial data (156 bytes) 19110 1726882547.90542: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882547.90548: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882547.90592: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882547.90604: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882547.90652: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882547.90662: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882547.90671: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882547.90791: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882547.92547: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" 
revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 19110 1726882547.92637: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 19110 1726882547.92731: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-191108pnkimox/tmpdvi36uw_ /root/.ansible/tmp/ansible-tmp-1726882547.7830362-19305-275522364635080/AnsiballZ_command.py <<< 19110 1726882547.92821: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 19110 1726882547.93809: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882547.93901: stderr chunk (state=3): >>><<< 19110 1726882547.93905: stdout chunk (state=3): >>><<< 19110 1726882547.93922: done transferring module to remote 19110 1726882547.93931: _low_level_execute_command(): starting 19110 1726882547.93935: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882547.7830362-19305-275522364635080/ /root/.ansible/tmp/ansible-tmp-1726882547.7830362-19305-275522364635080/AnsiballZ_command.py && sleep 0' 19110 1726882547.94361: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882547.94367: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882547.94380: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882547.94404: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found 
<<< 19110 1726882547.94415: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found <<< 19110 1726882547.94428: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882547.94478: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882547.94490: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882547.94587: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882547.96315: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882547.96356: stderr chunk (state=3): >>><<< 19110 1726882547.96361: stdout chunk (state=3): >>><<< 19110 1726882547.96381: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 1726882547.96385: _low_level_execute_command(): starting 19110 1726882547.96389: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882547.7830362-19305-275522364635080/AnsiballZ_command.py && sleep 0' 19110 1726882547.96812: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882547.96821: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882547.96832: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882547.96860: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found <<< 19110 1726882547.96874: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882547.96922: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing 
master <<< 19110 1726882547.96940: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882547.97039: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882548.10222: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 21:35:48.097493", "end": "2024-09-20 21:35:48.100651", "delta": "0:00:00.003158", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 19110 1726882548.11320: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. <<< 19110 1726882548.11386: stderr chunk (state=3): >>><<< 19110 1726882548.11390: stdout chunk (state=3): >>><<< 19110 1726882548.11405: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 21:35:48.097493", "end": "2024-09-20 21:35:48.100651", "delta": "0:00:00.003158", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 19110 1726882548.11434: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882547.7830362-19305-275522364635080/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 19110 1726882548.11445: _low_level_execute_command(): starting 19110 1726882548.11449: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882547.7830362-19305-275522364635080/ > /dev/null 2>&1 && sleep 0' 19110 1726882548.11920: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config <<< 19110 1726882548.11924: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882548.11961: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882548.11976: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882548.12027: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882548.12038: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882548.12140: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882548.13930: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882548.13976: stderr chunk (state=3): >>><<< 19110 1726882548.13979: stdout chunk (state=3): >>><<< 19110 1726882548.13993: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 1726882548.13998: handler run complete 19110 1726882548.14024: Evaluated conditional (False): False 19110 1726882548.14040: attempt loop complete, returning result 19110 1726882548.14043: _execute() done 19110 1726882548.14046: dumping result to json 19110 1726882548.14049: done dumping result, returning 19110 1726882548.14059: done running TaskExecutor() for managed_node1/TASK: Gather current interface info [0e448fcc-3ce9-5372-c19a-000000000115] 19110 1726882548.14065: sending task result for task 0e448fcc-3ce9-5372-c19a-000000000115 19110 1726882548.14159: done sending task result for task 0e448fcc-3ce9-5372-c19a-000000000115 19110 1726882548.14162: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003158", "end": "2024-09-20 21:35:48.100651", "rc": 0, "start": "2024-09-20 21:35:48.097493" } STDOUT: bonding_masters eth0 lo 19110 1726882548.14234: no more pending results, returning what we have 19110 1726882548.14237: results queue empty 19110 1726882548.14238: checking for any_errors_fatal 19110 1726882548.14240: done checking for any_errors_fatal 19110 1726882548.14240: checking for max_fail_percentage 19110 
1726882548.14242: done checking for max_fail_percentage 19110 1726882548.14242: checking to see if all hosts have failed and the running result is not ok 19110 1726882548.14243: done checking to see if all hosts have failed 19110 1726882548.14244: getting the remaining hosts for this loop 19110 1726882548.14245: done getting the remaining hosts for this loop 19110 1726882548.14249: getting the next task for host managed_node1 19110 1726882548.14254: done getting next task for host managed_node1 19110 1726882548.14257: ^ task is: TASK: Set current_interfaces 19110 1726882548.14261: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19110 1726882548.14265: getting variables 19110 1726882548.14267: in VariableManager get_vars() 19110 1726882548.14298: Calling all_inventory to load vars for managed_node1 19110 1726882548.14301: Calling groups_inventory to load vars for managed_node1 19110 1726882548.14304: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882548.14315: Calling all_plugins_play to load vars for managed_node1 19110 1726882548.14317: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882548.14320: Calling groups_plugins_play to load vars for managed_node1 19110 1726882548.14471: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882548.14589: done with get_vars() 19110 1726882548.14597: done getting variables 19110 1726882548.14638: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Friday 20 September 2024 21:35:48 -0400 (0:00:00.430) 0:00:05.003 ****** 19110 1726882548.14660: entering _queue_task() for managed_node1/set_fact 19110 1726882548.14852: worker is 1 (out of 1 available) 19110 1726882548.14866: exiting _queue_task() for managed_node1/set_fact 19110 1726882548.14879: done queuing things up, now waiting for results queue to drain 19110 1726882548.14880: waiting for pending results... 
19110 1726882548.15026: running TaskExecutor() for managed_node1/TASK: Set current_interfaces 19110 1726882548.15092: in run() - task 0e448fcc-3ce9-5372-c19a-000000000116 19110 1726882548.15102: variable 'ansible_search_path' from source: unknown 19110 1726882548.15105: variable 'ansible_search_path' from source: unknown 19110 1726882548.15134: calling self._execute() 19110 1726882548.15192: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882548.15196: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882548.15205: variable 'omit' from source: magic vars 19110 1726882548.15463: variable 'ansible_distribution_major_version' from source: facts 19110 1726882548.15475: Evaluated conditional (ansible_distribution_major_version != '6'): True 19110 1726882548.15481: variable 'omit' from source: magic vars 19110 1726882548.15511: variable 'omit' from source: magic vars 19110 1726882548.15588: variable '_current_interfaces' from source: set_fact 19110 1726882548.15632: variable 'omit' from source: magic vars 19110 1726882548.15671: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19110 1726882548.15913: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19110 1726882548.15928: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19110 1726882548.15941: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19110 1726882548.15950: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19110 1726882548.15976: variable 'inventory_hostname' from source: host vars for 'managed_node1' 19110 1726882548.15980: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882548.15983: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882548.16047: Set connection var ansible_timeout to 10 19110 1726882548.16058: Set connection var ansible_module_compression to ZIP_DEFLATED 19110 1726882548.16066: Set connection var ansible_shell_executable to /bin/sh 19110 1726882548.16069: Set connection var ansible_shell_type to sh 19110 1726882548.16071: Set connection var ansible_connection to ssh 19110 1726882548.16075: Set connection var ansible_pipelining to False 19110 1726882548.16093: variable 'ansible_shell_executable' from source: unknown 19110 1726882548.16096: variable 'ansible_connection' from source: unknown 19110 1726882548.16098: variable 'ansible_module_compression' from source: unknown 19110 1726882548.16102: variable 'ansible_shell_type' from source: unknown 19110 1726882548.16105: variable 'ansible_shell_executable' from source: unknown 19110 1726882548.16107: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882548.16109: variable 'ansible_pipelining' from source: unknown 19110 1726882548.16112: variable 'ansible_timeout' from source: unknown 19110 1726882548.16115: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882548.16210: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 19110 1726882548.16217: variable 'omit' from source: magic vars 19110 1726882548.16223: starting attempt loop 19110 1726882548.16227: running the handler 19110 1726882548.16237: handler run complete 19110 1726882548.16247: attempt loop complete, returning result 19110 1726882548.16249: _execute() done 19110 1726882548.16252: dumping result to json 19110 1726882548.16257: done dumping result, returning 19110 
1726882548.16261: done running TaskExecutor() for managed_node1/TASK: Set current_interfaces [0e448fcc-3ce9-5372-c19a-000000000116] 19110 1726882548.16268: sending task result for task 0e448fcc-3ce9-5372-c19a-000000000116 19110 1726882548.16339: done sending task result for task 0e448fcc-3ce9-5372-c19a-000000000116 19110 1726882548.16342: WORKER PROCESS EXITING ok: [managed_node1] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo" ] }, "changed": false } 19110 1726882548.16396: no more pending results, returning what we have 19110 1726882548.16399: results queue empty 19110 1726882548.16400: checking for any_errors_fatal 19110 1726882548.16406: done checking for any_errors_fatal 19110 1726882548.16407: checking for max_fail_percentage 19110 1726882548.16408: done checking for max_fail_percentage 19110 1726882548.16409: checking to see if all hosts have failed and the running result is not ok 19110 1726882548.16409: done checking to see if all hosts have failed 19110 1726882548.16410: getting the remaining hosts for this loop 19110 1726882548.16412: done getting the remaining hosts for this loop 19110 1726882548.16415: getting the next task for host managed_node1 19110 1726882548.16421: done getting next task for host managed_node1 19110 1726882548.16424: ^ task is: TASK: Show current_interfaces 19110 1726882548.16426: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19110 1726882548.16429: getting variables 19110 1726882548.16431: in VariableManager get_vars() 19110 1726882548.16452: Calling all_inventory to load vars for managed_node1 19110 1726882548.16457: Calling groups_inventory to load vars for managed_node1 19110 1726882548.16459: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882548.16469: Calling all_plugins_play to load vars for managed_node1 19110 1726882548.16472: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882548.16475: Calling groups_plugins_play to load vars for managed_node1 19110 1726882548.16746: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882548.16856: done with get_vars() 19110 1726882548.16862: done getting variables 19110 1726882548.16900: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Friday 20 September 2024 21:35:48 -0400 (0:00:00.022) 0:00:05.026 ****** 19110 1726882548.16919: entering _queue_task() for managed_node1/debug 19110 1726882548.17094: worker is 1 (out of 1 available) 19110 1726882548.17106: exiting _queue_task() for managed_node1/debug 19110 1726882548.17116: done queuing things up, now waiting for results queue to drain 19110 1726882548.17118: waiting for pending results... 
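The two tasks traced above ("Gather current interface info" and "Set current_interfaces", from `get_current_interfaces.yml`) can be sketched as follows. This is a hedged reconstruction from the logged `module_args` (`chdir: /sys/class/net`, `ls -1`), the `_current_interfaces` variable seen in the set_fact trace, and the resulting `current_interfaces` fact; the exact YAML wording in the test playbook is an assumption.

```yaml
# Hedged sketch of get_current_interfaces.yml, inferred from the trace above.
- name: Gather current interface info
  command:
    cmd: ls -1            # matches the logged _raw_params
    chdir: /sys/class/net # matches the logged chdir module_arg
  register: _current_interfaces  # variable name taken from the set_fact trace

- name: Set current_interfaces
  set_fact:
    # stdout_lines splits "bonding_masters\neth0\nlo" into the list seen
    # in the logged ansible_facts result; the exact template is assumed.
    current_interfaces: "{{ _current_interfaces.stdout_lines }}"
```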
19110 1726882548.17256: running TaskExecutor() for managed_node1/TASK: Show current_interfaces 19110 1726882548.17316: in run() - task 0e448fcc-3ce9-5372-c19a-000000000105 19110 1726882548.17326: variable 'ansible_search_path' from source: unknown 19110 1726882548.17330: variable 'ansible_search_path' from source: unknown 19110 1726882548.17362: calling self._execute() 19110 1726882548.17419: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882548.17425: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882548.17432: variable 'omit' from source: magic vars 19110 1726882548.17703: variable 'ansible_distribution_major_version' from source: facts 19110 1726882548.17713: Evaluated conditional (ansible_distribution_major_version != '6'): True 19110 1726882548.17719: variable 'omit' from source: magic vars 19110 1726882548.17744: variable 'omit' from source: magic vars 19110 1726882548.17813: variable 'current_interfaces' from source: set_fact 19110 1726882548.17832: variable 'omit' from source: magic vars 19110 1726882548.17867: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19110 1726882548.17894: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19110 1726882548.17911: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19110 1726882548.17923: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19110 1726882548.17932: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19110 1726882548.17953: variable 'inventory_hostname' from source: host vars for 'managed_node1' 19110 1726882548.17959: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882548.17962: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882548.18030: Set connection var ansible_timeout to 10 19110 1726882548.18039: Set connection var ansible_module_compression to ZIP_DEFLATED 19110 1726882548.18048: Set connection var ansible_shell_executable to /bin/sh 19110 1726882548.18051: Set connection var ansible_shell_type to sh 19110 1726882548.18053: Set connection var ansible_connection to ssh 19110 1726882548.18060: Set connection var ansible_pipelining to False 19110 1726882548.18079: variable 'ansible_shell_executable' from source: unknown 19110 1726882548.18082: variable 'ansible_connection' from source: unknown 19110 1726882548.18085: variable 'ansible_module_compression' from source: unknown 19110 1726882548.18087: variable 'ansible_shell_type' from source: unknown 19110 1726882548.18090: variable 'ansible_shell_executable' from source: unknown 19110 1726882548.18092: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882548.18094: variable 'ansible_pipelining' from source: unknown 19110 1726882548.18096: variable 'ansible_timeout' from source: unknown 19110 1726882548.18102: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882548.18199: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 19110 1726882548.18206: variable 'omit' from source: magic vars 19110 1726882548.18213: starting attempt loop 19110 1726882548.18216: running the handler 19110 1726882548.18251: handler run complete 19110 1726882548.18265: attempt loop complete, returning result 19110 1726882548.18268: _execute() done 19110 1726882548.18270: dumping result to json 19110 1726882548.18273: done dumping result, returning 19110 1726882548.18279: done 
running TaskExecutor() for managed_node1/TASK: Show current_interfaces [0e448fcc-3ce9-5372-c19a-000000000105] 19110 1726882548.18284: sending task result for task 0e448fcc-3ce9-5372-c19a-000000000105 19110 1726882548.18361: done sending task result for task 0e448fcc-3ce9-5372-c19a-000000000105 19110 1726882548.18366: WORKER PROCESS EXITING ok: [managed_node1] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo'] 19110 1726882548.18408: no more pending results, returning what we have 19110 1726882548.18411: results queue empty 19110 1726882548.18412: checking for any_errors_fatal 19110 1726882548.18416: done checking for any_errors_fatal 19110 1726882548.18416: checking for max_fail_percentage 19110 1726882548.18418: done checking for max_fail_percentage 19110 1726882548.18418: checking to see if all hosts have failed and the running result is not ok 19110 1726882548.18419: done checking to see if all hosts have failed 19110 1726882548.18420: getting the remaining hosts for this loop 19110 1726882548.18421: done getting the remaining hosts for this loop 19110 1726882548.18424: getting the next task for host managed_node1 19110 1726882548.18430: done getting next task for host managed_node1 19110 1726882548.18432: ^ task is: TASK: Include the task 'manage_test_interface.yml' 19110 1726882548.18441: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19110 1726882548.18444: getting variables 19110 1726882548.18446: in VariableManager get_vars() 19110 1726882548.18469: Calling all_inventory to load vars for managed_node1 19110 1726882548.18472: Calling groups_inventory to load vars for managed_node1 19110 1726882548.18475: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882548.18482: Calling all_plugins_play to load vars for managed_node1 19110 1726882548.18484: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882548.18486: Calling groups_plugins_play to load vars for managed_node1 19110 1726882548.18591: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882548.18711: done with get_vars() 19110 1726882548.18718: done getting variables TASK [Include the task 'manage_test_interface.yml'] **************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:26 Friday 20 September 2024 21:35:48 -0400 (0:00:00.018) 0:00:05.045 ****** 19110 1726882548.18780: entering _queue_task() for managed_node1/include_tasks 19110 1726882548.18938: worker is 1 (out of 1 available) 19110 1726882548.18950: exiting _queue_task() for managed_node1/include_tasks 19110 1726882548.18961: done queuing things up, now waiting for results queue to drain 19110 1726882548.18962: waiting for pending results... 
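The "Show current_interfaces" debug task traced above (from `show_interfaces.yml:5`) likely resembles the sketch below; the `msg` template is an assumption matched to the logged output `current_interfaces: ['bonding_masters', 'eth0', 'lo']`.

```yaml
# Hedged sketch; msg wording inferred from the logged MSG line.
- name: Show current_interfaces
  debug:
    msg: "current_interfaces: {{ current_interfaces }}"
```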
19110 1726882548.19095: running TaskExecutor() for managed_node1/TASK: Include the task 'manage_test_interface.yml' 19110 1726882548.19145: in run() - task 0e448fcc-3ce9-5372-c19a-000000000011 19110 1726882548.19155: variable 'ansible_search_path' from source: unknown 19110 1726882548.19185: calling self._execute() 19110 1726882548.19238: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882548.19242: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882548.19250: variable 'omit' from source: magic vars 19110 1726882548.19497: variable 'ansible_distribution_major_version' from source: facts 19110 1726882548.19507: Evaluated conditional (ansible_distribution_major_version != '6'): True 19110 1726882548.19513: _execute() done 19110 1726882548.19516: dumping result to json 19110 1726882548.19518: done dumping result, returning 19110 1726882548.19523: done running TaskExecutor() for managed_node1/TASK: Include the task 'manage_test_interface.yml' [0e448fcc-3ce9-5372-c19a-000000000011] 19110 1726882548.19532: sending task result for task 0e448fcc-3ce9-5372-c19a-000000000011 19110 1726882548.19616: done sending task result for task 0e448fcc-3ce9-5372-c19a-000000000011 19110 1726882548.19619: WORKER PROCESS EXITING 19110 1726882548.19655: no more pending results, returning what we have 19110 1726882548.19661: in VariableManager get_vars() 19110 1726882548.19689: Calling all_inventory to load vars for managed_node1 19110 1726882548.19692: Calling groups_inventory to load vars for managed_node1 19110 1726882548.19695: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882548.19701: Calling all_plugins_play to load vars for managed_node1 19110 1726882548.19703: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882548.19705: Calling groups_plugins_play to load vars for managed_node1 19110 1726882548.19840: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882548.19949: done with get_vars() 19110 1726882548.19953: variable 'ansible_search_path' from source: unknown 19110 1726882548.19962: we have included files to process 19110 1726882548.19963: generating all_blocks data 19110 1726882548.19966: done generating all_blocks data 19110 1726882548.19969: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 19110 1726882548.19970: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 19110 1726882548.19972: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 19110 1726882548.20279: in VariableManager get_vars() 19110 1726882548.20290: done with get_vars() 19110 1726882548.20428: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 19110 1726882548.20800: done processing included file 19110 1726882548.20802: iterating over new_blocks loaded from include file 19110 1726882548.20802: in VariableManager get_vars() 19110 1726882548.20809: done with get_vars() 19110 1726882548.20811: filtering new block on tags 19110 1726882548.20828: done filtering new block on tags 19110 1726882548.20830: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml for managed_node1 19110 1726882548.20833: extending task lists for all hosts with included blocks 19110 1726882548.20931: done extending task lists 19110 1726882548.20932: done processing included files 19110 1726882548.20932: results queue empty 19110 1726882548.20933: checking for any_errors_fatal 19110 1726882548.20935: done checking for 
any_errors_fatal 19110 1726882548.20935: checking for max_fail_percentage 19110 1726882548.20936: done checking for max_fail_percentage 19110 1726882548.20936: checking to see if all hosts have failed and the running result is not ok 19110 1726882548.20937: done checking to see if all hosts have failed 19110 1726882548.20937: getting the remaining hosts for this loop 19110 1726882548.20938: done getting the remaining hosts for this loop 19110 1726882548.20939: getting the next task for host managed_node1 19110 1726882548.20941: done getting next task for host managed_node1 19110 1726882548.20943: ^ task is: TASK: Ensure state in ["present", "absent"] 19110 1726882548.20945: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19110 1726882548.20946: getting variables 19110 1726882548.20947: in VariableManager get_vars() 19110 1726882548.20953: Calling all_inventory to load vars for managed_node1 19110 1726882548.20955: Calling groups_inventory to load vars for managed_node1 19110 1726882548.20957: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882548.20960: Calling all_plugins_play to load vars for managed_node1 19110 1726882548.20961: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882548.20964: Calling groups_plugins_play to load vars for managed_node1 19110 1726882548.21039: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882548.21142: done with get_vars() 19110 1726882548.21147: done getting variables 19110 1726882548.21192: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Ensure state in ["present", "absent"]] *********************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:3 Friday 20 September 2024 21:35:48 -0400 (0:00:00.024) 0:00:05.069 ****** 19110 1726882548.21208: entering _queue_task() for managed_node1/fail 19110 1726882548.21209: Creating lock for fail 19110 1726882548.21363: worker is 1 (out of 1 available) 19110 1726882548.21376: exiting _queue_task() for managed_node1/fail 19110 1726882548.21386: done queuing things up, now waiting for results queue to drain 19110 1726882548.21387: waiting for pending results... 
19110 1726882548.21522: running TaskExecutor() for managed_node1/TASK: Ensure state in ["present", "absent"]
19110 1726882548.21576: in run() - task 0e448fcc-3ce9-5372-c19a-000000000131
19110 1726882548.21586: variable 'ansible_search_path' from source: unknown
19110 1726882548.21589: variable 'ansible_search_path' from source: unknown
19110 1726882548.21618: calling self._execute()
19110 1726882548.21668: variable 'ansible_host' from source: host vars for 'managed_node1'
19110 1726882548.21671: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
19110 1726882548.21681: variable 'omit' from source: magic vars
19110 1726882548.21941: variable 'ansible_distribution_major_version' from source: facts
19110 1726882548.21951: Evaluated conditional (ansible_distribution_major_version != '6'): True
19110 1726882548.22040: variable 'state' from source: include params
19110 1726882548.22043: Evaluated conditional (state not in ["present", "absent"]): False
19110 1726882548.22046: when evaluation is False, skipping this task
19110 1726882548.22048: _execute() done
19110 1726882548.22052: dumping result to json
19110 1726882548.22063: done dumping result, returning
19110 1726882548.22070: done running TaskExecutor() for managed_node1/TASK: Ensure state in ["present", "absent"] [0e448fcc-3ce9-5372-c19a-000000000131]
19110 1726882548.22075: sending task result for task 0e448fcc-3ce9-5372-c19a-000000000131
19110 1726882548.22149: done sending task result for task 0e448fcc-3ce9-5372-c19a-000000000131
19110 1726882548.22151: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "state not in [\"present\", \"absent\"]",
    "skip_reason": "Conditional result was False"
}
19110 1726882548.22202: no more pending results, returning what we have
19110 1726882548.22204: results queue empty
19110 1726882548.22205: checking for any_errors_fatal
19110 1726882548.22206: done checking for any_errors_fatal
19110 1726882548.22207: checking for max_fail_percentage
19110 1726882548.22208: done checking for max_fail_percentage
19110 1726882548.22209: checking to see if all hosts have failed and the running result is not ok
19110 1726882548.22209: done checking to see if all hosts have failed
19110 1726882548.22210: getting the remaining hosts for this loop
19110 1726882548.22211: done getting the remaining hosts for this loop
19110 1726882548.22214: getting the next task for host managed_node1
19110 1726882548.22218: done getting next task for host managed_node1
19110 1726882548.22220: ^ task is: TASK: Ensure type in ["dummy", "tap", "veth"]
19110 1726882548.22223: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
19110 1726882548.22225: getting variables
19110 1726882548.22227: in VariableManager get_vars()
19110 1726882548.22244: Calling all_inventory to load vars for managed_node1
19110 1726882548.22245: Calling groups_inventory to load vars for managed_node1
19110 1726882548.22247: Calling all_plugins_inventory to load vars for managed_node1
19110 1726882548.22253: Calling all_plugins_play to load vars for managed_node1
19110 1726882548.22255: Calling groups_plugins_inventory to load vars for managed_node1
19110 1726882548.22257: Calling groups_plugins_play to load vars for managed_node1
19110 1726882548.22384: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
19110 1726882548.22494: done with get_vars()
19110 1726882548.22499: done getting variables
19110 1726882548.22534: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [Ensure type in ["dummy", "tap", "veth"]] *********************************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:8
Friday 20 September 2024 21:35:48 -0400 (0:00:00.013) 0:00:05.082 ******
19110 1726882548.22550: entering _queue_task() for managed_node1/fail
19110 1726882548.22700: worker is 1 (out of 1 available)
19110 1726882548.22711: exiting _queue_task() for managed_node1/fail
19110 1726882548.22721: done queuing things up, now waiting for results queue to drain
19110 1726882548.22723: waiting for pending results...
19110 1726882548.22852: running TaskExecutor() for managed_node1/TASK: Ensure type in ["dummy", "tap", "veth"]
19110 1726882548.22905: in run() - task 0e448fcc-3ce9-5372-c19a-000000000132
19110 1726882548.22915: variable 'ansible_search_path' from source: unknown
19110 1726882548.22920: variable 'ansible_search_path' from source: unknown
19110 1726882548.22946: calling self._execute()
19110 1726882548.23003: variable 'ansible_host' from source: host vars for 'managed_node1'
19110 1726882548.23007: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
19110 1726882548.23015: variable 'omit' from source: magic vars
19110 1726882548.23260: variable 'ansible_distribution_major_version' from source: facts
19110 1726882548.23269: Evaluated conditional (ansible_distribution_major_version != '6'): True
19110 1726882548.23360: variable 'type' from source: set_fact
19110 1726882548.23363: Evaluated conditional (type not in ["dummy", "tap", "veth"]): False
19110 1726882548.23373: when evaluation is False, skipping this task
19110 1726882548.23376: _execute() done
19110 1726882548.23378: dumping result to json
19110 1726882548.23380: done dumping result, returning
19110 1726882548.23384: done running TaskExecutor() for managed_node1/TASK: Ensure type in ["dummy", "tap", "veth"] [0e448fcc-3ce9-5372-c19a-000000000132]
19110 1726882548.23386: sending task result for task 0e448fcc-3ce9-5372-c19a-000000000132
19110 1726882548.23456: done sending task result for task 0e448fcc-3ce9-5372-c19a-000000000132
19110 1726882548.23460: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "type not in [\"dummy\", \"tap\", \"veth\"]",
    "skip_reason": "Conditional result was False"
}
19110 1726882548.23526: no more pending results, returning what we have
19110 1726882548.23528: results queue empty
19110 1726882548.23529: checking for any_errors_fatal
19110 1726882548.23533: done checking for any_errors_fatal
19110 1726882548.23534: checking for max_fail_percentage
19110 1726882548.23535: done checking for max_fail_percentage
19110 1726882548.23536: checking to see if all hosts have failed and the running result is not ok
19110 1726882548.23536: done checking to see if all hosts have failed
19110 1726882548.23537: getting the remaining hosts for this loop
19110 1726882548.23538: done getting the remaining hosts for this loop
19110 1726882548.23540: getting the next task for host managed_node1
19110 1726882548.23543: done getting next task for host managed_node1
19110 1726882548.23544: ^ task is: TASK: Include the task 'show_interfaces.yml'
19110 1726882548.23546: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
19110 1726882548.23549: getting variables
19110 1726882548.23549: in VariableManager get_vars()
19110 1726882548.23571: Calling all_inventory to load vars for managed_node1
19110 1726882548.23572: Calling groups_inventory to load vars for managed_node1
19110 1726882548.23574: Calling all_plugins_inventory to load vars for managed_node1
19110 1726882548.23580: Calling all_plugins_play to load vars for managed_node1
19110 1726882548.23586: Calling groups_plugins_inventory to load vars for managed_node1
19110 1726882548.23589: Calling groups_plugins_play to load vars for managed_node1
19110 1726882548.23685: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
19110 1726882548.23797: done with get_vars()
19110 1726882548.23805: done getting variables

TASK [Include the task 'show_interfaces.yml'] **********************************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:13
Friday 20 September 2024 21:35:48 -0400 (0:00:00.013) 0:00:05.095 ******
19110 1726882548.23862: entering _queue_task() for managed_node1/include_tasks
19110 1726882548.24006: worker is 1 (out of 1 available)
19110 1726882548.24017: exiting _queue_task() for managed_node1/include_tasks
19110 1726882548.24027: done queuing things up, now waiting for results queue to drain
19110 1726882548.24029: waiting for pending results...
19110 1726882548.24152: running TaskExecutor() for managed_node1/TASK: Include the task 'show_interfaces.yml'
19110 1726882548.24207: in run() - task 0e448fcc-3ce9-5372-c19a-000000000133
19110 1726882548.24217: variable 'ansible_search_path' from source: unknown
19110 1726882548.24220: variable 'ansible_search_path' from source: unknown
19110 1726882548.24248: calling self._execute()
19110 1726882548.24298: variable 'ansible_host' from source: host vars for 'managed_node1'
19110 1726882548.24301: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
19110 1726882548.24309: variable 'omit' from source: magic vars
19110 1726882548.24569: variable 'ansible_distribution_major_version' from source: facts
19110 1726882548.24584: Evaluated conditional (ansible_distribution_major_version != '6'): True
19110 1726882548.24590: _execute() done
19110 1726882548.24593: dumping result to json
19110 1726882548.24597: done dumping result, returning
19110 1726882548.24602: done running TaskExecutor() for managed_node1/TASK: Include the task 'show_interfaces.yml' [0e448fcc-3ce9-5372-c19a-000000000133]
19110 1726882548.24609: sending task result for task 0e448fcc-3ce9-5372-c19a-000000000133
19110 1726882548.24688: done sending task result for task 0e448fcc-3ce9-5372-c19a-000000000133
19110 1726882548.24691: WORKER PROCESS EXITING
19110 1726882548.24727: no more pending results, returning what we have
19110 1726882548.24731: in VariableManager get_vars()
19110 1726882548.24757: Calling all_inventory to load vars for managed_node1
19110 1726882548.24759: Calling groups_inventory to load vars for managed_node1
19110 1726882548.24761: Calling all_plugins_inventory to load vars for managed_node1
19110 1726882548.24769: Calling all_plugins_play to load vars for managed_node1
19110 1726882548.24771: Calling groups_plugins_inventory to load vars for managed_node1
19110 1726882548.24773: Calling groups_plugins_play to load vars for managed_node1
19110 1726882548.24899: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
19110 1726882548.25006: done with get_vars()
19110 1726882548.25011: variable 'ansible_search_path' from source: unknown
19110 1726882548.25011: variable 'ansible_search_path' from source: unknown
19110 1726882548.25036: we have included files to process
19110 1726882548.25036: generating all_blocks data
19110 1726882548.25037: done generating all_blocks data
19110 1726882548.25039: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml
19110 1726882548.25040: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml
19110 1726882548.25041: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml
19110 1726882548.25103: in VariableManager get_vars()
19110 1726882548.25113: done with get_vars()
19110 1726882548.25184: done processing included file
19110 1726882548.25186: iterating over new_blocks loaded from include file
19110 1726882548.25187: in VariableManager get_vars()
19110 1726882548.25194: done with get_vars()
19110 1726882548.25195: filtering new block on tags
19110 1726882548.25205: done filtering new block on tags
19110 1726882548.25207: done iterating over new_blocks loaded from include file
included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed_node1
19110 1726882548.25209: extending task lists for all hosts with included blocks
19110 1726882548.25652: done extending task lists
19110 1726882548.25653: done processing included files
19110 1726882548.25656: results queue empty
19110 1726882548.25656: checking for any_errors_fatal
19110 1726882548.25658: done checking for any_errors_fatal
19110 1726882548.25658: checking for max_fail_percentage
19110 1726882548.25659: done checking for max_fail_percentage
19110 1726882548.25659: checking to see if all hosts have failed and the running result is not ok
19110 1726882548.25661: done checking to see if all hosts have failed
19110 1726882548.25661: getting the remaining hosts for this loop
19110 1726882548.25662: done getting the remaining hosts for this loop
19110 1726882548.25665: getting the next task for host managed_node1
19110 1726882548.25668: done getting next task for host managed_node1
19110 1726882548.25669: ^ task is: TASK: Include the task 'get_current_interfaces.yml'
19110 1726882548.25671: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
19110 1726882548.25672: getting variables
19110 1726882548.25673: in VariableManager get_vars()
19110 1726882548.25678: Calling all_inventory to load vars for managed_node1
19110 1726882548.25679: Calling groups_inventory to load vars for managed_node1
19110 1726882548.25681: Calling all_plugins_inventory to load vars for managed_node1
19110 1726882548.25684: Calling all_plugins_play to load vars for managed_node1
19110 1726882548.25685: Calling groups_plugins_inventory to load vars for managed_node1
19110 1726882548.25688: Calling groups_plugins_play to load vars for managed_node1
19110 1726882548.25785: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
19110 1726882548.25893: done with get_vars()
19110 1726882548.25898: done getting variables

TASK [Include the task 'get_current_interfaces.yml'] ***************************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3
Friday 20 September 2024 21:35:48 -0400 (0:00:00.020) 0:00:05.116 ******
19110 1726882548.25942: entering _queue_task() for managed_node1/include_tasks
19110 1726882548.26091: worker is 1 (out of 1 available)
19110 1726882548.26102: exiting _queue_task() for managed_node1/include_tasks
19110 1726882548.26112: done queuing things up, now waiting for results queue to drain
19110 1726882548.26113: waiting for pending results...
19110 1726882548.26268: running TaskExecutor() for managed_node1/TASK: Include the task 'get_current_interfaces.yml'
19110 1726882548.26344: in run() - task 0e448fcc-3ce9-5372-c19a-00000000015c
19110 1726882548.26353: variable 'ansible_search_path' from source: unknown
19110 1726882548.26359: variable 'ansible_search_path' from source: unknown
19110 1726882548.26385: calling self._execute()
19110 1726882548.26434: variable 'ansible_host' from source: host vars for 'managed_node1'
19110 1726882548.26437: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
19110 1726882548.26445: variable 'omit' from source: magic vars
19110 1726882548.26672: variable 'ansible_distribution_major_version' from source: facts
19110 1726882548.26681: Evaluated conditional (ansible_distribution_major_version != '6'): True
19110 1726882548.26687: _execute() done
19110 1726882548.26689: dumping result to json
19110 1726882548.26692: done dumping result, returning
19110 1726882548.26697: done running TaskExecutor() for managed_node1/TASK: Include the task 'get_current_interfaces.yml' [0e448fcc-3ce9-5372-c19a-00000000015c]
19110 1726882548.26704: sending task result for task 0e448fcc-3ce9-5372-c19a-00000000015c
19110 1726882548.26780: done sending task result for task 0e448fcc-3ce9-5372-c19a-00000000015c
19110 1726882548.26783: WORKER PROCESS EXITING
19110 1726882548.26806: no more pending results, returning what we have
19110 1726882548.26810: in VariableManager get_vars()
19110 1726882548.26839: Calling all_inventory to load vars for managed_node1
19110 1726882548.26842: Calling groups_inventory to load vars for managed_node1
19110 1726882548.26845: Calling all_plugins_inventory to load vars for managed_node1
19110 1726882548.26853: Calling all_plugins_play to load vars for managed_node1
19110 1726882548.26855: Calling groups_plugins_inventory to load vars for managed_node1
19110 1726882548.26858: Calling groups_plugins_play to load vars for managed_node1
19110 1726882548.26969: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
19110 1726882548.27183: done with get_vars()
19110 1726882548.27189: variable 'ansible_search_path' from source: unknown
19110 1726882548.27190: variable 'ansible_search_path' from source: unknown
19110 1726882548.27248: we have included files to process
19110 1726882548.27249: generating all_blocks data
19110 1726882548.27251: done generating all_blocks data
19110 1726882548.27252: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml
19110 1726882548.27253: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml
19110 1726882548.27258: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml
19110 1726882548.27523: done processing included file
19110 1726882548.27525: iterating over new_blocks loaded from include file
19110 1726882548.27526: in VariableManager get_vars()
19110 1726882548.27538: done with get_vars()
19110 1726882548.27540: filtering new block on tags
19110 1726882548.27559: done filtering new block on tags
19110 1726882548.27561: done iterating over new_blocks loaded from include file
included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed_node1
19110 1726882548.27566: extending task lists for all hosts with included blocks
19110 1726882548.27724: done extending task lists
19110 1726882548.27726: done processing included files
19110 1726882548.27726: results queue empty
19110 1726882548.27727: checking for any_errors_fatal
19110 1726882548.27730: done checking for any_errors_fatal
19110 1726882548.27731: checking for max_fail_percentage
19110 1726882548.27731: done checking for max_fail_percentage
19110 1726882548.27732: checking to see if all hosts have failed and the running result is not ok
19110 1726882548.27733: done checking to see if all hosts have failed
19110 1726882548.27734: getting the remaining hosts for this loop
19110 1726882548.27735: done getting the remaining hosts for this loop
19110 1726882548.27737: getting the next task for host managed_node1
19110 1726882548.27741: done getting next task for host managed_node1
19110 1726882548.27743: ^ task is: TASK: Gather current interface info
19110 1726882548.27746: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
19110 1726882548.27748: getting variables
19110 1726882548.27749: in VariableManager get_vars()
19110 1726882548.27758: Calling all_inventory to load vars for managed_node1
19110 1726882548.27760: Calling groups_inventory to load vars for managed_node1
19110 1726882548.27763: Calling all_plugins_inventory to load vars for managed_node1
19110 1726882548.27768: Calling all_plugins_play to load vars for managed_node1
19110 1726882548.27771: Calling groups_plugins_inventory to load vars for managed_node1
19110 1726882548.27774: Calling groups_plugins_play to load vars for managed_node1
19110 1726882548.28662: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
19110 1726882548.28876: done with get_vars()
19110 1726882548.28884: done getting variables
19110 1726882548.28926: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [Gather current interface info] *******************************************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3
Friday 20 September 2024 21:35:48 -0400 (0:00:00.030) 0:00:05.146 ******
19110 1726882548.28956: entering _queue_task() for managed_node1/command
19110 1726882548.29209: worker is 1 (out of 1 available)
19110 1726882548.29221: exiting _queue_task() for managed_node1/command
19110 1726882548.29232: done queuing things up, now waiting for results queue to drain
19110 1726882548.29234: waiting for pending results...
19110 1726882548.29499: running TaskExecutor() for managed_node1/TASK: Gather current interface info
19110 1726882548.29619: in run() - task 0e448fcc-3ce9-5372-c19a-000000000193
19110 1726882548.29637: variable 'ansible_search_path' from source: unknown
19110 1726882548.29644: variable 'ansible_search_path' from source: unknown
19110 1726882548.29696: calling self._execute()
19110 1726882548.29775: variable 'ansible_host' from source: host vars for 'managed_node1'
19110 1726882548.29790: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
19110 1726882548.29808: variable 'omit' from source: magic vars
19110 1726882548.30184: variable 'ansible_distribution_major_version' from source: facts
19110 1726882548.30201: Evaluated conditional (ansible_distribution_major_version != '6'): True
19110 1726882548.30213: variable 'omit' from source: magic vars
19110 1726882548.30284: variable 'omit' from source: magic vars
19110 1726882548.30320: variable 'omit' from source: magic vars
19110 1726882548.30379: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
19110 1726882548.30417: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
19110 1726882548.30450: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
19110 1726882548.30481: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
19110 1726882548.30494: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
19110 1726882548.30524: variable 'inventory_hostname' from source: host vars for 'managed_node1'
19110 1726882548.30533: variable 'ansible_host' from source: host vars for 'managed_node1'
19110 1726882548.30543: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
19110 1726882548.30651: Set connection var ansible_timeout to 10
19110 1726882548.30715: Set connection var ansible_module_compression to ZIP_DEFLATED
19110 1726882548.30727: Set connection var ansible_shell_executable to /bin/sh
19110 1726882548.30734: Set connection var ansible_shell_type to sh
19110 1726882548.30740: Set connection var ansible_connection to ssh
19110 1726882548.30750: Set connection var ansible_pipelining to False
19110 1726882548.30796: variable 'ansible_shell_executable' from source: unknown
19110 1726882548.30808: variable 'ansible_connection' from source: unknown
19110 1726882548.30817: variable 'ansible_module_compression' from source: unknown
19110 1726882548.30825: variable 'ansible_shell_type' from source: unknown
19110 1726882548.30832: variable 'ansible_shell_executable' from source: unknown
19110 1726882548.30839: variable 'ansible_host' from source: host vars for 'managed_node1'
19110 1726882548.30847: variable 'ansible_pipelining' from source: unknown
19110 1726882548.30853: variable 'ansible_timeout' from source: unknown
19110 1726882548.30867: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
19110 1726882548.31031: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
19110 1726882548.31046: variable 'omit' from source: magic vars
19110 1726882548.31059: starting attempt loop
19110 1726882548.31068: running the handler
19110 1726882548.31085: _low_level_execute_command(): starting
19110 1726882548.31096: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
19110 1726882548.31832: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
19110 1726882548.31841: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
19110 1726882548.31892: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
19110 1726882548.31896: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<<
19110 1726882548.31898: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
19110 1726882548.31940: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<<
19110 1726882548.31950: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
19110 1726882548.32061: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
19110 1726882548.34172: stdout chunk (state=3): >>>/root <<<
19110 1726882548.34619: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
19110 1726882548.34689: stderr chunk (state=3): >>><<<
19110 1726882548.34692: stdout chunk (state=3): >>><<<
19110 1726882548.34716: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
19110 1726882548.34726: _low_level_execute_command(): starting
19110 1726882548.34731: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882548.3471289-19331-67603168999386 `" && echo ansible-tmp-1726882548.3471289-19331-67603168999386="` echo /root/.ansible/tmp/ansible-tmp-1726882548.3471289-19331-67603168999386 `" ) && sleep 0'
19110 1726882548.36309: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<<
19110 1726882548.36318: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
19110 1726882548.36328: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
19110 1726882548.36343: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
19110 1726882548.36381: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<<
19110 1726882548.36477: stderr chunk (state=3): >>>debug2: match not found <<<
19110 1726882548.36487: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
19110 1726882548.36500: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<<
19110 1726882548.36508: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<<
19110 1726882548.36516: stderr chunk (state=3): >>>debug1: re-parsing configuration <<<
19110 1726882548.36523: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
19110 1726882548.36532: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
19110 1726882548.36543: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
19110 1726882548.36550: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<<
19110 1726882548.36559: stderr chunk (state=3): >>>debug2: match found <<<
19110 1726882548.36569: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
19110 1726882548.36641: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
19110 1726882548.36687: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
19110 1726882548.36697: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
19110 1726882548.36818: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
19110 1726882548.38701: stdout chunk (state=3): >>>ansible-tmp-1726882548.3471289-19331-67603168999386=/root/.ansible/tmp/ansible-tmp-1726882548.3471289-19331-67603168999386 <<<
19110 1726882548.38867: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
19110 1726882548.38871: stdout chunk (state=3): >>><<<
19110 1726882548.38879: stderr chunk (state=3): >>><<<
19110 1726882548.38895: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882548.3471289-19331-67603168999386=/root/.ansible/tmp/ansible-tmp-1726882548.3471289-19331-67603168999386 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
19110 1726882548.38925: variable 'ansible_module_compression' from source: unknown
19110 1726882548.38979: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-191108pnkimox/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED
19110 1726882548.39012: variable 'ansible_facts' from source: unknown
19110 1726882548.39103: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882548.3471289-19331-67603168999386/AnsiballZ_command.py
19110 1726882548.39721: Sending initial data
19110 1726882548.39724: Sent initial data (155 bytes)
19110 1726882548.42251: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
19110 1726882548.42258: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
19110 1726882548.42300: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found <<<
19110 1726882548.42305: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<<
19110 1726882548.42324: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
19110 1726882548.42330: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
19110 1726882548.42512: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<<
19110 1726882548.42532: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
19110 1726882548.42651: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
19110 1726882548.44431: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<<
19110 1726882548.44520: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<<
19110 1726882548.44613: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-191108pnkimox/tmp3hu2lzl7 /root/.ansible/tmp/ansible-tmp-1726882548.3471289-19331-67603168999386/AnsiballZ_command.py <<<
19110 1726882548.44703: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<<
19110 1726882548.46178: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
19110 1726882548.46356: stderr chunk (state=3): >>><<<
19110 1726882548.46360: stdout chunk (state=3): >>><<<
19110 1726882548.46362: done transferring module to remote
19110 1726882548.46374: _low_level_execute_command(): starting
19110 1726882548.46376: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882548.3471289-19331-67603168999386/ /root/.ansible/tmp/ansible-tmp-1726882548.3471289-19331-67603168999386/AnsiballZ_command.py && sleep 0'
19110 1726882548.47862: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<<
19110 1726882548.47882: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
19110 1726882548.47905: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
19110 1726882548.47927: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
19110 1726882548.47978: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<<
19110 1726882548.48026: stderr chunk (state=3): >>>debug2: match not found <<<
19110 1726882548.48042: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
19110 1726882548.48066: stderr chunk (state=3): >>>debug1:
configuration requests final Match pass <<< 19110 1726882548.48135: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882548.48148: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882548.48166: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882548.48181: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882548.48198: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882548.48210: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882548.48221: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882548.48244: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882548.48326: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882548.48468: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882548.48484: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882548.48681: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882548.50517: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882548.50520: stdout chunk (state=3): >>><<< 19110 1726882548.50523: stderr chunk (state=3): >>><<< 19110 1726882548.50614: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 1726882548.50617: _low_level_execute_command(): starting 19110 1726882548.50622: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882548.3471289-19331-67603168999386/AnsiballZ_command.py && sleep 0' 19110 1726882548.52032: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882548.52035: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882548.52075: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882548.52078: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found <<< 19110 1726882548.52081: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882548.52135: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882548.52394: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882548.52401: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882548.52507: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882548.65868: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 21:35:48.654077", "end": "2024-09-20 21:35:48.657219", "delta": "0:00:00.003142", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 19110 1726882548.66980: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
<<< 19110 1726882548.67069: stderr chunk (state=3): >>><<< 19110 1726882548.67072: stdout chunk (state=3): >>><<< 19110 1726882548.67218: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 21:35:48.654077", "end": "2024-09-20 21:35:48.657219", "delta": "0:00:00.003142", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
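The module result returned above by `_low_level_execute_command()` is a JSON document printed on the remote interpreter's stdout. As a minimal sketch (a hypothetical helper, not Ansible's actual result handling), this is how the interface list could be pulled out of such a payload; the sample string is trimmed to the fields used here:

```python
import json

# Raw module stdout as captured in the log above, truncated to the fields used here.
raw = '{"changed": true, "stdout": "bonding_masters\\neth0\\nlo", "rc": 0, "cmd": ["ls", "-1"]}'

def interfaces_from_result(payload: str) -> list[str]:
    """Parse a command-module JSON result and split stdout into lines,
    mirroring how Ansible derives stdout_lines from stdout."""
    result = json.loads(payload)
    if result.get("rc", 1) != 0:
        raise RuntimeError(f"command failed: rc={result.get('rc')}")
    return result["stdout"].splitlines()

print(interfaces_from_result(raw))  # ['bonding_masters', 'eth0', 'lo']
```

The interesting detail is that the JSON rides on stdout while all the OpenSSH `debug1`/`debug2` chatter stays on stderr, which is why the two streams are tracked as separate chunks throughout this log.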
19110 1726882548.67222: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882548.3471289-19331-67603168999386/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 19110 1726882548.67225: _low_level_execute_command(): starting 19110 1726882548.67227: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882548.3471289-19331-67603168999386/ > /dev/null 2>&1 && sleep 0' 19110 1726882548.68639: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882548.68762: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882548.68780: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882548.68797: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882548.68840: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882548.68869: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882548.68978: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882548.68997: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882548.69008: stderr chunk 
(state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882548.69020: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882548.69032: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882548.69044: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882548.69061: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882548.69078: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882548.69090: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882548.69103: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882548.69187: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882548.69309: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882548.69323: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882548.69527: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882548.71407: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882548.71411: stdout chunk (state=3): >>><<< 19110 1726882548.71413: stderr chunk (state=3): >>><<< 19110 1726882548.71472: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 1726882548.71475: handler run complete 19110 1726882548.71478: Evaluated conditional (False): False 19110 1726882548.71770: attempt loop complete, returning result 19110 1726882548.71773: _execute() done 19110 1726882548.71775: dumping result to json 19110 1726882548.71777: done dumping result, returning 19110 1726882548.71778: done running TaskExecutor() for managed_node1/TASK: Gather current interface info [0e448fcc-3ce9-5372-c19a-000000000193] 19110 1726882548.71780: sending task result for task 0e448fcc-3ce9-5372-c19a-000000000193 19110 1726882548.71846: done sending task result for task 0e448fcc-3ce9-5372-c19a-000000000193 19110 1726882548.71849: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003142", "end": "2024-09-20 21:35:48.657219", "rc": 0, "start": "2024-09-20 21:35:48.654077" } STDOUT: bonding_masters eth0 lo 19110 1726882548.71951: no more pending results, returning what we have 19110 1726882548.71957: results queue empty 19110 1726882548.71958: checking for any_errors_fatal 19110 1726882548.71959: done checking for any_errors_fatal 19110 1726882548.71960: checking for max_fail_percentage 19110 1726882548.71962: done checking for max_fail_percentage 19110 1726882548.71966: checking to see if all hosts have failed 
and the running result is not ok 19110 1726882548.71967: done checking to see if all hosts have failed 19110 1726882548.71968: getting the remaining hosts for this loop 19110 1726882548.71969: done getting the remaining hosts for this loop 19110 1726882548.71973: getting the next task for host managed_node1 19110 1726882548.71980: done getting next task for host managed_node1 19110 1726882548.71982: ^ task is: TASK: Set current_interfaces 19110 1726882548.71987: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19110 1726882548.71991: getting variables 19110 1726882548.71993: in VariableManager get_vars() 19110 1726882548.72022: Calling all_inventory to load vars for managed_node1 19110 1726882548.72025: Calling groups_inventory to load vars for managed_node1 19110 1726882548.72028: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882548.72038: Calling all_plugins_play to load vars for managed_node1 19110 1726882548.72041: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882548.72044: Calling groups_plugins_play to load vars for managed_node1 19110 1726882548.72226: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882548.72468: done with get_vars() 19110 1726882548.72479: done getting variables 19110 1726882548.72539: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Friday 20 September 2024 21:35:48 -0400 (0:00:00.436) 0:00:05.583 ****** 19110 1726882548.72575: entering _queue_task() for managed_node1/set_fact 19110 1726882548.72822: worker is 1 (out of 1 available) 19110 1726882548.72834: exiting _queue_task() for managed_node1/set_fact 19110 1726882548.72847: done queuing things up, now waiting for results queue to drain 19110 1726882548.72848: waiting for pending results... 
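Note that the `set_fact` task queued above never opens a remote connection: the log shows the handler completing immediately after `running the handler`, with no `_low_level_execute_command()` calls in between. Conceptually it just merges key/value pairs into the host's fact store. A toy illustration of that behavior (this is an assumption-laden sketch, not Ansible's `VariableManager` implementation):

```python
# Minimal fact store mimicking what a set_fact action does: no remote
# execution, just a dictionary merge scoped to one host.
host_facts: dict[str, object] = {}

def set_fact(facts: dict[str, object], **new_facts: object) -> None:
    """Merge new facts into the host's fact dictionary, overwriting duplicates."""
    facts.update(new_facts)

# Mirroring the task result reported later in this log:
set_fact(host_facts, current_interfaces=["bonding_masters", "eth0", "lo"])
print(host_facts["current_interfaces"])  # ['bonding_masters', 'eth0', 'lo']
```

This is also why the `Set current_interfaces` task takes only 0.062s in the timing banner below, versus 0.436s for the preceding command task that had to round-trip over SSH.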
19110 1726882548.73577: running TaskExecutor() for managed_node1/TASK: Set current_interfaces 19110 1726882548.73900: in run() - task 0e448fcc-3ce9-5372-c19a-000000000194 19110 1726882548.73913: variable 'ansible_search_path' from source: unknown 19110 1726882548.73916: variable 'ansible_search_path' from source: unknown 19110 1726882548.73951: calling self._execute() 19110 1726882548.74160: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882548.74165: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882548.74174: variable 'omit' from source: magic vars 19110 1726882548.75048: variable 'ansible_distribution_major_version' from source: facts 19110 1726882548.75216: Evaluated conditional (ansible_distribution_major_version != '6'): True 19110 1726882548.75228: variable 'omit' from source: magic vars 19110 1726882548.75289: variable 'omit' from source: magic vars 19110 1726882548.75534: variable '_current_interfaces' from source: set_fact 19110 1726882548.75602: variable 'omit' from source: magic vars 19110 1726882548.75787: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19110 1726882548.75825: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19110 1726882548.75894: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19110 1726882548.75982: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19110 1726882548.75998: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19110 1726882548.76030: variable 'inventory_hostname' from source: host vars for 'managed_node1' 19110 1726882548.76078: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882548.76088: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882548.76306: Set connection var ansible_timeout to 10 19110 1726882548.77084: Set connection var ansible_module_compression to ZIP_DEFLATED 19110 1726882548.77380: Set connection var ansible_shell_executable to /bin/sh 19110 1726882548.77388: Set connection var ansible_shell_type to sh 19110 1726882548.77394: Set connection var ansible_connection to ssh 19110 1726882548.77401: Set connection var ansible_pipelining to False 19110 1726882548.77427: variable 'ansible_shell_executable' from source: unknown 19110 1726882548.77475: variable 'ansible_connection' from source: unknown 19110 1726882548.77482: variable 'ansible_module_compression' from source: unknown 19110 1726882548.77488: variable 'ansible_shell_type' from source: unknown 19110 1726882548.77494: variable 'ansible_shell_executable' from source: unknown 19110 1726882548.77500: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882548.77507: variable 'ansible_pipelining' from source: unknown 19110 1726882548.77512: variable 'ansible_timeout' from source: unknown 19110 1726882548.77519: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882548.77923: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 19110 1726882548.77937: variable 'omit' from source: magic vars 19110 1726882548.77946: starting attempt loop 19110 1726882548.77953: running the handler 19110 1726882548.77972: handler run complete 19110 1726882548.77987: attempt loop complete, returning result 19110 1726882548.77992: _execute() done 19110 1726882548.78000: dumping result to json 19110 1726882548.78007: done dumping result, returning 19110 
1726882548.78017: done running TaskExecutor() for managed_node1/TASK: Set current_interfaces [0e448fcc-3ce9-5372-c19a-000000000194] 19110 1726882548.78025: sending task result for task 0e448fcc-3ce9-5372-c19a-000000000194 ok: [managed_node1] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo" ] }, "changed": false } 19110 1726882548.78177: no more pending results, returning what we have 19110 1726882548.78181: results queue empty 19110 1726882548.78182: checking for any_errors_fatal 19110 1726882548.78188: done checking for any_errors_fatal 19110 1726882548.78189: checking for max_fail_percentage 19110 1726882548.78190: done checking for max_fail_percentage 19110 1726882548.78191: checking to see if all hosts have failed and the running result is not ok 19110 1726882548.78192: done checking to see if all hosts have failed 19110 1726882548.78192: getting the remaining hosts for this loop 19110 1726882548.78194: done getting the remaining hosts for this loop 19110 1726882548.78197: getting the next task for host managed_node1 19110 1726882548.78204: done getting next task for host managed_node1 19110 1726882548.78206: ^ task is: TASK: Show current_interfaces 19110 1726882548.78210: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19110 1726882548.78213: getting variables 19110 1726882548.78214: in VariableManager get_vars() 19110 1726882548.78242: Calling all_inventory to load vars for managed_node1 19110 1726882548.78245: Calling groups_inventory to load vars for managed_node1 19110 1726882548.78248: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882548.78263: Calling all_plugins_play to load vars for managed_node1 19110 1726882548.78268: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882548.78272: Calling groups_plugins_play to load vars for managed_node1 19110 1726882548.78484: done sending task result for task 0e448fcc-3ce9-5372-c19a-000000000194 19110 1726882548.78488: WORKER PROCESS EXITING 19110 1726882548.78501: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882548.78690: done with get_vars() 19110 1726882548.78700: done getting variables 19110 1726882548.78757: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Friday 20 September 2024 21:35:48 -0400 (0:00:00.062) 0:00:05.645 ****** 19110 1726882548.78788: entering _queue_task() for managed_node1/debug 19110 1726882548.79802: worker is 1 (out of 1 available) 19110 1726882548.79814: exiting _queue_task() for managed_node1/debug 19110 1726882548.79825: done queuing things up, now waiting for results queue to drain 19110 1726882548.79826: waiting for pending 
results... 19110 1726882548.80893: running TaskExecutor() for managed_node1/TASK: Show current_interfaces 19110 1726882548.81005: in run() - task 0e448fcc-3ce9-5372-c19a-00000000015d 19110 1726882548.81062: variable 'ansible_search_path' from source: unknown 19110 1726882548.81073: variable 'ansible_search_path' from source: unknown 19110 1726882548.81110: calling self._execute() 19110 1726882548.81187: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882548.81198: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882548.81210: variable 'omit' from source: magic vars 19110 1726882548.81570: variable 'ansible_distribution_major_version' from source: facts 19110 1726882548.81995: Evaluated conditional (ansible_distribution_major_version != '6'): True 19110 1726882548.82060: variable 'omit' from source: magic vars 19110 1726882548.82111: variable 'omit' from source: magic vars 19110 1726882548.82212: variable 'current_interfaces' from source: set_fact 19110 1726882548.82698: variable 'omit' from source: magic vars 19110 1726882548.82742: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19110 1726882548.82787: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19110 1726882548.82812: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19110 1726882548.82835: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19110 1726882548.82851: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19110 1726882548.83048: variable 'inventory_hostname' from source: host vars for 'managed_node1' 19110 1726882548.83061: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882548.83074: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882548.83179: Set connection var ansible_timeout to 10 19110 1726882548.83197: Set connection var ansible_module_compression to ZIP_DEFLATED 19110 1726882548.83207: Set connection var ansible_shell_executable to /bin/sh 19110 1726882548.83214: Set connection var ansible_shell_type to sh 19110 1726882548.83220: Set connection var ansible_connection to ssh 19110 1726882548.83230: Set connection var ansible_pipelining to False 19110 1726882548.83259: variable 'ansible_shell_executable' from source: unknown 19110 1726882548.83272: variable 'ansible_connection' from source: unknown 19110 1726882548.83281: variable 'ansible_module_compression' from source: unknown 19110 1726882548.83288: variable 'ansible_shell_type' from source: unknown 19110 1726882548.83295: variable 'ansible_shell_executable' from source: unknown 19110 1726882548.83301: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882548.83309: variable 'ansible_pipelining' from source: unknown 19110 1726882548.83316: variable 'ansible_timeout' from source: unknown 19110 1726882548.83324: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882548.83457: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 19110 1726882548.83869: variable 'omit' from source: magic vars 19110 1726882548.83875: starting attempt loop 19110 1726882548.83881: running the handler 19110 1726882548.83933: handler run complete 19110 1726882548.83956: attempt loop complete, returning result 19110 1726882548.83966: _execute() done 19110 1726882548.83974: dumping result to json 19110 1726882548.83981: done dumping result, returning 19110 1726882548.83994: done 
running TaskExecutor() for managed_node1/TASK: Show current_interfaces [0e448fcc-3ce9-5372-c19a-00000000015d] 19110 1726882548.84004: sending task result for task 0e448fcc-3ce9-5372-c19a-00000000015d ok: [managed_node1] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo'] 19110 1726882548.84154: no more pending results, returning what we have 19110 1726882548.84157: results queue empty 19110 1726882548.84159: checking for any_errors_fatal 19110 1726882548.84164: done checking for any_errors_fatal 19110 1726882548.84165: checking for max_fail_percentage 19110 1726882548.84167: done checking for max_fail_percentage 19110 1726882548.84168: checking to see if all hosts have failed and the running result is not ok 19110 1726882548.84169: done checking to see if all hosts have failed 19110 1726882548.84169: getting the remaining hosts for this loop 19110 1726882548.84171: done getting the remaining hosts for this loop 19110 1726882548.84174: getting the next task for host managed_node1 19110 1726882548.84182: done getting next task for host managed_node1 19110 1726882548.84185: ^ task is: TASK: Install iproute 19110 1726882548.84188: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19110 1726882548.84192: getting variables 19110 1726882548.84194: in VariableManager get_vars() 19110 1726882548.84222: Calling all_inventory to load vars for managed_node1 19110 1726882548.84225: Calling groups_inventory to load vars for managed_node1 19110 1726882548.84228: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882548.84238: Calling all_plugins_play to load vars for managed_node1 19110 1726882548.84241: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882548.84243: Calling groups_plugins_play to load vars for managed_node1 19110 1726882548.84388: done sending task result for task 0e448fcc-3ce9-5372-c19a-00000000015d 19110 1726882548.84391: WORKER PROCESS EXITING 19110 1726882548.84410: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882548.84608: done with get_vars() 19110 1726882548.84619: done getting variables 19110 1726882548.84712: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Install iproute] ********************************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16 Friday 20 September 2024 21:35:48 -0400 (0:00:00.059) 0:00:05.704 ****** 19110 1726882548.84742: entering _queue_task() for managed_node1/package 19110 1726882548.85041: worker is 1 (out of 1 available) 19110 1726882548.85055: exiting _queue_task() for managed_node1/package 19110 1726882548.85069: done queuing things up, now waiting for results queue to drain 19110 1726882548.85071: waiting for pending results... 
19110 1726882548.85828: running TaskExecutor() for managed_node1/TASK: Install iproute 19110 1726882548.85926: in run() - task 0e448fcc-3ce9-5372-c19a-000000000134 19110 1726882548.85947: variable 'ansible_search_path' from source: unknown 19110 1726882548.85957: variable 'ansible_search_path' from source: unknown 19110 1726882548.86006: calling self._execute() 19110 1726882548.86092: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882548.86103: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882548.86118: variable 'omit' from source: magic vars 19110 1726882548.86552: variable 'ansible_distribution_major_version' from source: facts 19110 1726882548.86577: Evaluated conditional (ansible_distribution_major_version != '6'): True 19110 1726882548.86588: variable 'omit' from source: magic vars 19110 1726882548.86635: variable 'omit' from source: magic vars 19110 1726882548.86866: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 19110 1726882548.89240: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 19110 1726882548.89320: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 19110 1726882548.89481: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 19110 1726882548.89521: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 19110 1726882548.89558: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 19110 1726882548.89653: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19110 1726882548.89802: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19110 1726882548.89907: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19110 1726882548.89948: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19110 1726882548.90004: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19110 1726882548.90219: variable '__network_is_ostree' from source: set_fact 19110 1726882548.90228: variable 'omit' from source: magic vars 19110 1726882548.90260: variable 'omit' from source: magic vars 19110 1726882548.90340: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19110 1726882548.90375: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19110 1726882548.90438: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19110 1726882548.90558: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19110 1726882548.90576: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19110 1726882548.90604: variable 'inventory_hostname' from source: host vars for 'managed_node1' 19110 1726882548.90611: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882548.90617: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node1' 19110 1726882548.90895: Set connection var ansible_timeout to 10 19110 1726882548.90912: Set connection var ansible_module_compression to ZIP_DEFLATED 19110 1726882548.90922: Set connection var ansible_shell_executable to /bin/sh 19110 1726882548.90928: Set connection var ansible_shell_type to sh 19110 1726882548.90934: Set connection var ansible_connection to ssh 19110 1726882548.90943: Set connection var ansible_pipelining to False 19110 1726882548.90995: variable 'ansible_shell_executable' from source: unknown 19110 1726882548.91004: variable 'ansible_connection' from source: unknown 19110 1726882548.91011: variable 'ansible_module_compression' from source: unknown 19110 1726882548.91018: variable 'ansible_shell_type' from source: unknown 19110 1726882548.91024: variable 'ansible_shell_executable' from source: unknown 19110 1726882548.91031: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882548.91038: variable 'ansible_pipelining' from source: unknown 19110 1726882548.91044: variable 'ansible_timeout' from source: unknown 19110 1726882548.91052: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882548.91171: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 19110 1726882548.91199: variable 'omit' from source: magic vars 19110 1726882548.91210: starting attempt loop 19110 1726882548.91217: running the handler 19110 1726882548.91228: variable 'ansible_facts' from source: unknown 19110 1726882548.91233: variable 'ansible_facts' from source: unknown 19110 1726882548.91272: _low_level_execute_command(): starting 19110 1726882548.91288: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 19110 
1726882548.92017: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882548.92033: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882548.92051: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882548.92078: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882548.92120: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882548.92133: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882548.92147: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882548.92172: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882548.92185: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882548.92195: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882548.92206: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882548.92219: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882548.92233: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882548.92244: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882548.92256: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882548.92281: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882548.92361: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882548.92397: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882548.92414: stderr 
chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882548.92630: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882548.94180: stdout chunk (state=3): >>>/root <<< 19110 1726882548.94358: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882548.94362: stdout chunk (state=3): >>><<< 19110 1726882548.94364: stderr chunk (state=3): >>><<< 19110 1726882548.94464: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 1726882548.94468: _low_level_execute_command(): starting 19110 1726882548.94471: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882548.9438553-19357-69840317055515 `" && echo 
ansible-tmp-1726882548.9438553-19357-69840317055515="` echo /root/.ansible/tmp/ansible-tmp-1726882548.9438553-19357-69840317055515 `" ) && sleep 0' 19110 1726882548.95019: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882548.95035: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882548.95049: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882548.95071: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882548.95110: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882548.95124: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882548.95142: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882548.95161: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882548.95174: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882548.95184: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882548.95195: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882548.95206: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882548.95220: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882548.95232: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882548.95245: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882548.95261: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882548.95338: stderr chunk (state=3): >>>debug1: 
auto-mux: Trying existing master <<< 19110 1726882548.95359: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882548.95376: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882548.95500: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882548.97386: stdout chunk (state=3): >>>ansible-tmp-1726882548.9438553-19357-69840317055515=/root/.ansible/tmp/ansible-tmp-1726882548.9438553-19357-69840317055515 <<< 19110 1726882548.97557: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882548.97560: stdout chunk (state=3): >>><<< 19110 1726882548.97564: stderr chunk (state=3): >>><<< 19110 1726882548.97596: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882548.9438553-19357-69840317055515=/root/.ansible/tmp/ansible-tmp-1726882548.9438553-19357-69840317055515 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 1726882548.97627: variable 'ansible_module_compression' from source: unknown 19110 1726882548.97690: ANSIBALLZ: Using generic lock for ansible.legacy.dnf 19110 1726882548.97694: ANSIBALLZ: Acquiring lock 19110 1726882548.97696: ANSIBALLZ: Lock acquired: 139855634067296 19110 1726882548.97698: ANSIBALLZ: Creating module 19110 1726882549.16975: ANSIBALLZ: Writing module into payload 19110 1726882549.17264: ANSIBALLZ: Writing module 19110 1726882549.17299: ANSIBALLZ: Renaming module 19110 1726882549.17314: ANSIBALLZ: Done creating module 19110 1726882549.17336: variable 'ansible_facts' from source: unknown 19110 1726882549.17448: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882548.9438553-19357-69840317055515/AnsiballZ_dnf.py 19110 1726882549.17610: Sending initial data 19110 1726882549.17613: Sent initial data (151 bytes) 19110 1726882549.18619: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882549.18633: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882549.18646: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882549.18667: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882549.18717: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882549.18730: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882549.18744: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882549.18766: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882549.18780: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882549.18791: stderr chunk 
(state=3): >>>debug1: re-parsing configuration <<< 19110 1726882549.18814: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882549.18829: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882549.18845: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882549.18857: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882549.18870: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882549.18884: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882549.18962: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882549.18980: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882549.18993: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882549.19123: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882549.20951: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 19110 1726882549.21052: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 19110 1726882549.21149: stdout chunk (state=3): 
>>>sftp> put /root/.ansible/tmp/ansible-local-191108pnkimox/tmp_fcvjht8 /root/.ansible/tmp/ansible-tmp-1726882548.9438553-19357-69840317055515/AnsiballZ_dnf.py <<< 19110 1726882549.21238: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 19110 1726882549.22959: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882549.22963: stdout chunk (state=3): >>><<< 19110 1726882549.22971: stderr chunk (state=3): >>><<< 19110 1726882549.22992: done transferring module to remote 19110 1726882549.23001: _low_level_execute_command(): starting 19110 1726882549.23006: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882548.9438553-19357-69840317055515/ /root/.ansible/tmp/ansible-tmp-1726882548.9438553-19357-69840317055515/AnsiballZ_dnf.py && sleep 0' 19110 1726882549.23626: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882549.23634: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882549.23644: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882549.23659: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882549.23698: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882549.23705: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882549.23715: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882549.23728: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882549.23735: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882549.23781: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882549.23786: stderr 
chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882549.23788: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882549.24013: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882549.24020: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882549.24028: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882549.24037: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882549.24115: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882549.24123: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882549.24127: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882549.24243: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882549.26047: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882549.26051: stdout chunk (state=3): >>><<< 19110 1726882549.26058: stderr chunk (state=3): >>><<< 19110 1726882549.26080: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 1726882549.26083: _low_level_execute_command(): starting 19110 1726882549.26088: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882548.9438553-19357-69840317055515/AnsiballZ_dnf.py && sleep 0' 19110 1726882549.26684: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882549.26693: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882549.26702: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882549.26716: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882549.26759: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882549.26763: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882549.26776: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882549.26789: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882549.26796: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882549.26802: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882549.26810: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882549.26818: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882549.26831: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882549.26842: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882549.26850: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882549.26975: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882549.26978: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882549.27381: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882549.27385: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882549.27387: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882550.27688: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 19110 1726882550.33387: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
<<< 19110 1726882550.33391: stdout chunk (state=3): >>><<< 19110 1726882550.33396: stderr chunk (state=3): >>><<< 19110 1726882550.33420: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 
debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 19110 1726882550.33468: done with _execute_module (ansible.legacy.dnf, {'name': 'iproute', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882548.9438553-19357-69840317055515/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 19110 1726882550.33474: _low_level_execute_command(): starting 19110 1726882550.33480: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882548.9438553-19357-69840317055515/ > /dev/null 2>&1 && sleep 0' 19110 1726882550.34141: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882550.34150: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882550.34161: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882550.34179: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882550.34225: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882550.34228: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882550.34237: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882550.34251: 
stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882550.34259: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882550.34268: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882550.34278: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882550.34287: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882550.34300: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882550.34309: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882550.34317: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882550.34330: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882550.34402: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882550.34426: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882550.34432: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882550.34553: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882550.36433: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882550.36436: stdout chunk (state=3): >>><<< 19110 1726882550.36444: stderr chunk (state=3): >>><<< 19110 1726882550.36459: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 1726882550.36469: handler run complete 19110 1726882550.36629: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 19110 1726882550.36809: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 19110 1726882550.36851: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 19110 1726882550.36886: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 19110 1726882550.36916: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 19110 1726882550.36988: variable '__install_status' from source: unknown 19110 1726882550.37008: Evaluated conditional (__install_status is success): True 19110 1726882550.37024: attempt loop complete, returning result 19110 1726882550.37027: _execute() done 19110 1726882550.37030: dumping result to json 19110 1726882550.37036: done dumping result, returning 19110 1726882550.37048: done running TaskExecutor() for managed_node1/TASK: Install iproute [0e448fcc-3ce9-5372-c19a-000000000134] 19110 1726882550.37051: 
sending task result for task 0e448fcc-3ce9-5372-c19a-000000000134 19110 1726882550.37151: done sending task result for task 0e448fcc-3ce9-5372-c19a-000000000134 19110 1726882550.37153: WORKER PROCESS EXITING ok: [managed_node1] => { "attempts": 1, "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 19110 1726882550.37303: no more pending results, returning what we have 19110 1726882550.37306: results queue empty 19110 1726882550.37307: checking for any_errors_fatal 19110 1726882550.37310: done checking for any_errors_fatal 19110 1726882550.37311: checking for max_fail_percentage 19110 1726882550.37312: done checking for max_fail_percentage 19110 1726882550.37313: checking to see if all hosts have failed and the running result is not ok 19110 1726882550.37314: done checking to see if all hosts have failed 19110 1726882550.37314: getting the remaining hosts for this loop 19110 1726882550.37316: done getting the remaining hosts for this loop 19110 1726882550.37319: getting the next task for host managed_node1 19110 1726882550.37324: done getting next task for host managed_node1 19110 1726882550.37326: ^ task is: TASK: Create veth interface {{ interface }} 19110 1726882550.37329: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19110 1726882550.37331: getting variables 19110 1726882550.37332: in VariableManager get_vars() 19110 1726882550.37351: Calling all_inventory to load vars for managed_node1 19110 1726882550.37356: Calling groups_inventory to load vars for managed_node1 19110 1726882550.37360: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882550.37370: Calling all_plugins_play to load vars for managed_node1 19110 1726882550.37372: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882550.37375: Calling groups_plugins_play to load vars for managed_node1 19110 1726882550.37549: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882550.37804: done with get_vars() 19110 1726882550.37813: done getting variables 19110 1726882550.37890: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 19110 1726882550.38024: variable 'interface' from source: set_fact TASK [Create veth interface lsr27] ********************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27 Friday 20 September 2024 21:35:50 -0400 (0:00:01.533) 0:00:07.237 ****** 19110 1726882550.38053: entering _queue_task() for managed_node1/command 19110 1726882550.38349: worker is 1 (out of 1 available) 19110 1726882550.38367: exiting _queue_task() for managed_node1/command 19110 1726882550.38379: done queuing things up, now waiting for results queue to drain 19110 1726882550.38381: waiting for pending results... 
19110 1726882550.38645: running TaskExecutor() for managed_node1/TASK: Create veth interface lsr27 19110 1726882550.38805: in run() - task 0e448fcc-3ce9-5372-c19a-000000000135 19110 1726882550.38827: variable 'ansible_search_path' from source: unknown 19110 1726882550.38835: variable 'ansible_search_path' from source: unknown 19110 1726882550.39116: variable 'interface' from source: set_fact 19110 1726882550.39214: variable 'interface' from source: set_fact 19110 1726882550.39316: variable 'interface' from source: set_fact 19110 1726882550.39473: Loaded config def from plugin (lookup/items) 19110 1726882550.39489: Loading LookupModule 'items' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/items.py 19110 1726882550.39513: variable 'omit' from source: magic vars 19110 1726882550.39637: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882550.39660: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882550.39679: variable 'omit' from source: magic vars 19110 1726882550.39925: variable 'ansible_distribution_major_version' from source: facts 19110 1726882550.39938: Evaluated conditional (ansible_distribution_major_version != '6'): True 19110 1726882550.40169: variable 'type' from source: set_fact 19110 1726882550.40179: variable 'state' from source: include params 19110 1726882550.40191: variable 'interface' from source: set_fact 19110 1726882550.40203: variable 'current_interfaces' from source: set_fact 19110 1726882550.40212: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 19110 1726882550.40222: variable 'omit' from source: magic vars 19110 1726882550.40270: variable 'omit' from source: magic vars 19110 1726882550.40332: variable 'item' from source: unknown 19110 1726882550.40418: variable 'item' from source: unknown 19110 1726882550.40475: variable 'omit' from source: magic vars 19110 1726882550.40536: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19110 1726882550.40575: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19110 1726882550.40597: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19110 1726882550.40620: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19110 1726882550.40642: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19110 1726882550.40682: variable 'inventory_hostname' from source: host vars for 'managed_node1' 19110 1726882550.40690: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882550.40697: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882550.40810: Set connection var ansible_timeout to 10 19110 1726882550.40827: Set connection var ansible_module_compression to ZIP_DEFLATED 19110 1726882550.40836: Set connection var ansible_shell_executable to /bin/sh 19110 1726882550.40847: Set connection var ansible_shell_type to sh 19110 1726882550.40861: Set connection var ansible_connection to ssh 19110 1726882550.40872: Set connection var ansible_pipelining to False 19110 1726882550.40899: variable 'ansible_shell_executable' from source: unknown 19110 1726882550.40906: variable 'ansible_connection' from source: unknown 19110 1726882550.40913: variable 'ansible_module_compression' from source: unknown 19110 1726882550.40919: variable 'ansible_shell_type' from source: unknown 19110 1726882550.40925: variable 'ansible_shell_executable' from source: unknown 19110 1726882550.40931: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882550.40938: variable 'ansible_pipelining' from source: unknown 19110 1726882550.40944: variable 'ansible_timeout' from 
source: unknown 19110 1726882550.40959: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882550.41107: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 19110 1726882550.41121: variable 'omit' from source: magic vars 19110 1726882550.41130: starting attempt loop 19110 1726882550.41136: running the handler 19110 1726882550.41152: _low_level_execute_command(): starting 19110 1726882550.41173: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 19110 1726882550.41946: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882550.41967: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882550.41984: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882550.42002: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882550.42050: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882550.42069: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882550.42089: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882550.42109: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882550.42121: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882550.42131: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882550.42142: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882550.42162: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882550.42183: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882550.42199: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882550.42212: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882550.42228: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882550.42316: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882550.42338: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882550.42353: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882550.42493: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882550.44058: stdout chunk (state=3): >>>/root <<< 19110 1726882550.44167: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882550.44240: stderr chunk (state=3): >>><<< 19110 1726882550.44261: stdout chunk (state=3): >>><<< 19110 1726882550.44379: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 1726882550.44388: _low_level_execute_command(): starting 19110 1726882550.44391: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882550.442934-19422-137875454822490 `" && echo ansible-tmp-1726882550.442934-19422-137875454822490="` echo /root/.ansible/tmp/ansible-tmp-1726882550.442934-19422-137875454822490 `" ) && sleep 0' 19110 1726882550.46003: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882550.46069: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882550.46088: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882550.46107: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882550.46150: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882550.46282: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882550.46296: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882550.46314: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882550.46325: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882550.46335: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 
1726882550.46346: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882550.46360: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882550.46383: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882550.46396: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882550.46406: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882550.46419: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882550.46502: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882550.46523: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882550.46579: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882550.46707: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882550.48578: stdout chunk (state=3): >>>ansible-tmp-1726882550.442934-19422-137875454822490=/root/.ansible/tmp/ansible-tmp-1726882550.442934-19422-137875454822490 <<< 19110 1726882550.48784: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882550.48787: stdout chunk (state=3): >>><<< 19110 1726882550.48789: stderr chunk (state=3): >>><<< 19110 1726882550.49013: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882550.442934-19422-137875454822490=/root/.ansible/tmp/ansible-tmp-1726882550.442934-19422-137875454822490 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 1726882550.49016: variable 'ansible_module_compression' from source: unknown 19110 1726882550.49019: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-191108pnkimox/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 19110 1726882550.49021: variable 'ansible_facts' from source: unknown 19110 1726882550.49028: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882550.442934-19422-137875454822490/AnsiballZ_command.py 19110 1726882550.49660: Sending initial data 19110 1726882550.49665: Sent initial data (155 bytes) 19110 1726882550.50818: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882550.50821: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882550.50849: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882550.50857: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882550.50866: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882550.50883: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882550.50891: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882550.50898: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882550.50906: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882550.50916: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882550.50928: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882550.50935: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882550.50945: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882550.50959: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882550.51029: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882550.51043: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882550.51074: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882550.51179: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882550.52908: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 
1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 19110 1726882550.53001: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 19110 1726882550.53098: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-191108pnkimox/tmpsv8x6mzo /root/.ansible/tmp/ansible-tmp-1726882550.442934-19422-137875454822490/AnsiballZ_command.py <<< 19110 1726882550.53187: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 19110 1726882550.54588: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882550.54666: stderr chunk (state=3): >>><<< 19110 1726882550.54670: stdout chunk (state=3): >>><<< 19110 1726882550.54691: done transferring module to remote 19110 1726882550.54703: _low_level_execute_command(): starting 19110 1726882550.54708: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882550.442934-19422-137875454822490/ /root/.ansible/tmp/ansible-tmp-1726882550.442934-19422-137875454822490/AnsiballZ_command.py && sleep 0' 19110 1726882550.55370: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882550.55378: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882550.55388: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882550.55401: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882550.55446: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882550.55452: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882550.55462: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882550.55478: 
stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882550.55485: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882550.55492: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882550.55499: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882550.55508: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882550.55519: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882550.55525: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882550.55532: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882550.55547: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882550.55620: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882550.55637: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882550.55650: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882550.55780: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882550.57544: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882550.57550: stdout chunk (state=3): >>><<< 19110 1726882550.57557: stderr chunk (state=3): >>><<< 19110 1726882550.57573: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 1726882550.57576: _low_level_execute_command(): starting 19110 1726882550.57582: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882550.442934-19422-137875454822490/AnsiballZ_command.py && sleep 0' 19110 1726882550.58493: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882550.58503: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882550.58514: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882550.58529: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882550.58572: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882550.58580: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882550.58592: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882550.58606: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882550.58615: stderr chunk 
(state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882550.58622: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882550.58631: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882550.58641: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882550.58656: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882550.58662: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882550.58671: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882550.58682: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882550.58750: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882550.58770: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882550.58785: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882550.58907: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882550.72568: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "add", "lsr27", "type", "veth", "peer", "name", "peerlsr27"], "start": "2024-09-20 21:35:50.718228", "end": "2024-09-20 21:35:50.724008", "delta": "0:00:00.005780", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link add lsr27 type veth peer name peerlsr27", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 19110 1726882550.74533: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 
closed. <<< 19110 1726882550.74569: stderr chunk (state=3): >>><<< 19110 1726882550.74573: stdout chunk (state=3): >>><<< 19110 1726882550.74589: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "add", "lsr27", "type", "veth", "peer", "name", "peerlsr27"], "start": "2024-09-20 21:35:50.718228", "end": "2024-09-20 21:35:50.724008", "delta": "0:00:00.005780", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link add lsr27 type veth peer name peerlsr27", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
19110 1726882550.74618: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link add lsr27 type veth peer name peerlsr27', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882550.442934-19422-137875454822490/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 19110 1726882550.74627: _low_level_execute_command(): starting 19110 1726882550.74632: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882550.442934-19422-137875454822490/ > /dev/null 2>&1 && sleep 0' 19110 1726882550.75081: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882550.75084: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882550.75097: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882550.75103: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882550.75124: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882550.75128: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882550.75177: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882550.75187: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882550.75563: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882550.78545: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882550.78604: stderr chunk (state=3): >>><<< 19110 1726882550.78608: stdout chunk (state=3): >>><<< 19110 1726882550.78623: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 
debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 1726882550.78629: handler run complete 19110 1726882550.78647: Evaluated conditional (False): False 19110 1726882550.78658: attempt loop complete, returning result 19110 1726882550.78679: variable 'item' from source: unknown 19110 1726882550.78740: variable 'item' from source: unknown ok: [managed_node1] => (item=ip link add lsr27 type veth peer name peerlsr27) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "add", "lsr27", "type", "veth", "peer", "name", "peerlsr27" ], "delta": "0:00:00.005780", "end": "2024-09-20 21:35:50.724008", "item": "ip link add lsr27 type veth peer name peerlsr27", "rc": 0, "start": "2024-09-20 21:35:50.718228" } 19110 1726882550.78916: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882550.78919: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882550.78922: variable 'omit' from source: magic vars 19110 1726882550.78987: variable 'ansible_distribution_major_version' from source: facts 19110 1726882550.78991: Evaluated conditional (ansible_distribution_major_version != '6'): True 19110 1726882550.79112: variable 'type' from source: set_fact 19110 1726882550.79115: variable 'state' from source: include params 19110 1726882550.79118: variable 'interface' from source: set_fact 19110 1726882550.79121: variable 'current_interfaces' from source: set_fact 19110 1726882550.79127: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 19110 1726882550.79131: variable 'omit' from source: magic vars 19110 1726882550.79149: variable 'omit' from source: magic vars 19110 1726882550.79174: variable 'item' from source: unknown 19110 1726882550.79216: variable 'item' from source: unknown 19110 1726882550.79227: variable 'omit' from source: magic vars 19110 1726882550.79243: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19110 1726882550.79260: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19110 1726882550.79263: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19110 1726882550.79271: variable 'inventory_hostname' from source: host vars for 'managed_node1' 19110 1726882550.79274: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882550.79277: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882550.79325: Set connection var ansible_timeout to 10 19110 1726882550.79333: Set connection var ansible_module_compression to ZIP_DEFLATED 19110 1726882550.79338: Set connection var ansible_shell_executable to /bin/sh 19110 1726882550.79340: Set connection var ansible_shell_type to sh 19110 1726882550.79343: Set connection var ansible_connection to ssh 19110 1726882550.79347: Set connection var ansible_pipelining to False 19110 1726882550.79363: variable 'ansible_shell_executable' from source: unknown 19110 1726882550.79372: variable 'ansible_connection' from source: unknown 19110 1726882550.79375: variable 'ansible_module_compression' from source: unknown 19110 1726882550.79377: variable 'ansible_shell_type' from source: unknown 19110 1726882550.79379: variable 'ansible_shell_executable' from source: unknown 19110 1726882550.79381: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882550.79383: variable 'ansible_pipelining' from source: unknown 19110 1726882550.79420: variable 'ansible_timeout' from source: unknown 19110 1726882550.79423: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882550.79505: Loading ActionModule 'command' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 19110 1726882550.79533: variable 'omit' from source: magic vars 19110 1726882550.79558: starting attempt loop 19110 1726882550.79569: running the handler 19110 1726882550.79591: _low_level_execute_command(): starting 19110 1726882550.79600: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 19110 1726882550.80318: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882550.80340: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882550.80366: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882550.80392: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882550.80475: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882550.80479: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882550.80481: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882550.80531: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK 
<<< 19110 1726882550.80546: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882550.80650: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882550.82250: stdout chunk (state=3): >>>/root <<< 19110 1726882550.82351: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882550.82393: stderr chunk (state=3): >>><<< 19110 1726882550.82397: stdout chunk (state=3): >>><<< 19110 1726882550.82412: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 1726882550.82419: _low_level_execute_command(): starting 19110 1726882550.82425: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882550.8241167-19422-263012859576350 `" && 
echo ansible-tmp-1726882550.8241167-19422-263012859576350="` echo /root/.ansible/tmp/ansible-tmp-1726882550.8241167-19422-263012859576350 `" ) && sleep 0' 19110 1726882550.82843: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882550.82846: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882550.82882: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found <<< 19110 1726882550.82885: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 19110 1726882550.82887: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882550.82890: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882550.82935: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882550.82939: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882550.83046: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882550.84893: stdout chunk (state=3): >>>ansible-tmp-1726882550.8241167-19422-263012859576350=/root/.ansible/tmp/ansible-tmp-1726882550.8241167-19422-263012859576350 <<< 19110 1726882550.85002: stderr chunk (state=3): >>>debug2: Received exit status 
from master 0 <<< 19110 1726882550.85060: stderr chunk (state=3): >>><<< 19110 1726882550.85066: stdout chunk (state=3): >>><<< 19110 1726882550.85080: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882550.8241167-19422-263012859576350=/root/.ansible/tmp/ansible-tmp-1726882550.8241167-19422-263012859576350 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 1726882550.85099: variable 'ansible_module_compression' from source: unknown 19110 1726882550.85135: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-191108pnkimox/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 19110 1726882550.85150: variable 'ansible_facts' from source: unknown 19110 1726882550.85199: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882550.8241167-19422-263012859576350/AnsiballZ_command.py 19110 1726882550.85299: Sending initial data 19110 1726882550.85302: Sent 
initial data (156 bytes) 19110 1726882550.85974: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882550.85978: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882550.86014: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882550.86017: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882550.86019: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882550.86069: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882550.86077: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882550.86183: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882550.87907: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports 
extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 19110 1726882550.88000: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 19110 1726882550.88096: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-191108pnkimox/tmpzy96pua6 /root/.ansible/tmp/ansible-tmp-1726882550.8241167-19422-263012859576350/AnsiballZ_command.py <<< 19110 1726882550.88188: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 19110 1726882550.89573: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882550.89653: stderr chunk (state=3): >>><<< 19110 1726882550.89656: stdout chunk (state=3): >>><<< 19110 1726882550.89686: done transferring module to remote 19110 1726882550.89694: _low_level_execute_command(): starting 19110 1726882550.89699: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882550.8241167-19422-263012859576350/ /root/.ansible/tmp/ansible-tmp-1726882550.8241167-19422-263012859576350/AnsiballZ_command.py && sleep 0' 19110 1726882550.90692: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882550.90698: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882550.90803: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882550.90809: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882550.90822: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found <<< 19110 1726882550.90827: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882550.90936: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882550.90939: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882550.90954: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882550.91078: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882550.92969: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882550.92973: stdout chunk (state=3): >>><<< 19110 1726882550.92975: stderr chunk (state=3): >>><<< 19110 1726882550.93320: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 1726882550.93324: _low_level_execute_command(): starting 19110 1726882550.93326: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882550.8241167-19422-263012859576350/AnsiballZ_command.py && sleep 0' 19110 1726882550.93886: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882550.93900: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882550.93914: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882550.93930: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882550.93977: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882550.93992: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882550.94006: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882550.94022: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882550.94032: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882550.94042: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882550.94052: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882550.94067: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882550.94082: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882550.94097: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882550.94110: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882550.94124: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882550.94203: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882550.94226: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882550.94240: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882550.94375: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882551.07695: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "peerlsr27", "up"], "start": "2024-09-20 21:35:51.071865", "end": "2024-09-20 21:35:51.075430", "delta": "0:00:00.003565", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set peerlsr27 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 19110 1726882551.08886: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
<<< 19110 1726882551.08890: stdout chunk (state=3): >>><<< 19110 1726882551.08896: stderr chunk (state=3): >>><<< 19110 1726882551.08915: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "peerlsr27", "up"], "start": "2024-09-20 21:35:51.071865", "end": "2024-09-20 21:35:51.075430", "delta": "0:00:00.003565", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set peerlsr27 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
19110 1726882551.08946: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link set peerlsr27 up', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882550.8241167-19422-263012859576350/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 19110 1726882551.08953: _low_level_execute_command(): starting 19110 1726882551.08968: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882550.8241167-19422-263012859576350/ > /dev/null 2>&1 && sleep 0' 19110 1726882551.09607: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882551.09616: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882551.09628: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882551.09641: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882551.09683: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882551.09690: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882551.09700: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882551.09714: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882551.09721: stderr chunk (state=3): 
>>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882551.09728: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882551.09736: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882551.09745: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882551.09756: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882551.09771: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882551.09781: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882551.09787: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882551.09855: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882551.09879: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882551.09891: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882551.10034: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882551.11852: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882551.11858: stdout chunk (state=3): >>><<< 19110 1726882551.11861: stderr chunk (state=3): >>><<< 19110 1726882551.11887: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 1726882551.11892: handler run complete 19110 1726882551.11913: Evaluated conditional (False): False 19110 1726882551.11923: attempt loop complete, returning result 19110 1726882551.11942: variable 'item' from source: unknown 19110 1726882551.12015: variable 'item' from source: unknown ok: [managed_node1] => (item=ip link set peerlsr27 up) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "set", "peerlsr27", "up" ], "delta": "0:00:00.003565", "end": "2024-09-20 21:35:51.075430", "item": "ip link set peerlsr27 up", "rc": 0, "start": "2024-09-20 21:35:51.071865" } 19110 1726882551.12136: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882551.12139: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882551.12141: variable 'omit' from source: magic vars 19110 1726882551.12432: variable 'ansible_distribution_major_version' from source: facts 19110 1726882551.12443: Evaluated conditional (ansible_distribution_major_version != '6'): True 19110 1726882551.12634: variable 'type' from source: set_fact 19110 1726882551.12643: variable 'state' from source: include params 19110 1726882551.12650: variable 'interface' from source: set_fact 19110 1726882551.12661: variable 'current_interfaces' from source: set_fact 19110 
1726882551.12674: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 19110 1726882551.12681: variable 'omit' from source: magic vars 19110 1726882551.12698: variable 'omit' from source: magic vars 19110 1726882551.12737: variable 'item' from source: unknown 19110 1726882551.12804: variable 'item' from source: unknown 19110 1726882551.12823: variable 'omit' from source: magic vars 19110 1726882551.12846: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19110 1726882551.12860: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19110 1726882551.12874: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19110 1726882551.12891: variable 'inventory_hostname' from source: host vars for 'managed_node1' 19110 1726882551.12897: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882551.12904: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882551.12985: Set connection var ansible_timeout to 10 19110 1726882551.13004: Set connection var ansible_module_compression to ZIP_DEFLATED 19110 1726882551.13013: Set connection var ansible_shell_executable to /bin/sh 19110 1726882551.13019: Set connection var ansible_shell_type to sh 19110 1726882551.13024: Set connection var ansible_connection to ssh 19110 1726882551.13032: Set connection var ansible_pipelining to False 19110 1726882551.13053: variable 'ansible_shell_executable' from source: unknown 19110 1726882551.13065: variable 'ansible_connection' from source: unknown 19110 1726882551.13073: variable 'ansible_module_compression' from source: unknown 19110 1726882551.13079: variable 'ansible_shell_type' from source: unknown 19110 1726882551.13085: 
variable 'ansible_shell_executable' from source: unknown 19110 1726882551.13090: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882551.13101: variable 'ansible_pipelining' from source: unknown 19110 1726882551.13107: variable 'ansible_timeout' from source: unknown 19110 1726882551.13114: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882551.13211: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 19110 1726882551.13223: variable 'omit' from source: magic vars 19110 1726882551.13231: starting attempt loop 19110 1726882551.13237: running the handler 19110 1726882551.13246: _low_level_execute_command(): starting 19110 1726882551.13253: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 19110 1726882551.13893: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882551.13907: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882551.13923: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882551.13940: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882551.13986: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882551.13999: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882551.14014: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882551.14032: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882551.14044: stderr chunk (state=3): >>>debug2: 
resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882551.14053: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882551.14067: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882551.14083: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882551.14098: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882551.14108: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882551.14118: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882551.14130: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882551.14211: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882551.14227: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882551.14241: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882551.14368: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882551.15932: stdout chunk (state=3): >>>/root <<< 19110 1726882551.16122: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882551.16125: stdout chunk (state=3): >>><<< 19110 1726882551.16127: stderr chunk (state=3): >>><<< 19110 1726882551.16225: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 1726882551.16229: _low_level_execute_command(): starting 19110 1726882551.16232: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882551.161431-19422-197955881722062 `" && echo ansible-tmp-1726882551.161431-19422-197955881722062="` echo /root/.ansible/tmp/ansible-tmp-1726882551.161431-19422-197955881722062 `" ) && sleep 0' 19110 1726882551.16833: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882551.16847: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882551.16862: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882551.16896: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882551.16940: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882551.16952: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882551.16969: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882551.16986: stderr chunk 
(state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882551.16999: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882551.17015: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882551.17027: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882551.17040: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882551.17056: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882551.17070: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882551.17081: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882551.17093: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882551.17169: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882551.17190: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882551.17205: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882551.17331: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882551.19185: stdout chunk (state=3): >>>ansible-tmp-1726882551.161431-19422-197955881722062=/root/.ansible/tmp/ansible-tmp-1726882551.161431-19422-197955881722062 <<< 19110 1726882551.19368: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882551.19371: stdout chunk (state=3): >>><<< 19110 1726882551.19380: stderr chunk (state=3): >>><<< 19110 1726882551.19396: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882551.161431-19422-197955881722062=/root/.ansible/tmp/ansible-tmp-1726882551.161431-19422-197955881722062 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 1726882551.19419: variable 'ansible_module_compression' from source: unknown 19110 1726882551.19462: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-191108pnkimox/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 19110 1726882551.19487: variable 'ansible_facts' from source: unknown 19110 1726882551.19555: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882551.161431-19422-197955881722062/AnsiballZ_command.py 19110 1726882551.20082: Sending initial data 19110 1726882551.20086: Sent initial data (155 bytes) 19110 1726882551.21096: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882551.21099: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882551.21119: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882551.21154: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882551.21163: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration <<< 19110 1726882551.21171: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882551.21185: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found <<< 19110 1726882551.21190: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882551.21272: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882551.21290: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882551.21409: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882551.23115: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 19110 1726882551.23216: 
stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 19110 1726882551.23306: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-191108pnkimox/tmplf96pomz /root/.ansible/tmp/ansible-tmp-1726882551.161431-19422-197955881722062/AnsiballZ_command.py <<< 19110 1726882551.23408: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 19110 1726882551.24793: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882551.25021: stderr chunk (state=3): >>><<< 19110 1726882551.25024: stdout chunk (state=3): >>><<< 19110 1726882551.25027: done transferring module to remote 19110 1726882551.25029: _low_level_execute_command(): starting 19110 1726882551.25031: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882551.161431-19422-197955881722062/ /root/.ansible/tmp/ansible-tmp-1726882551.161431-19422-197955881722062/AnsiballZ_command.py && sleep 0' 19110 1726882551.25600: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882551.25613: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882551.25626: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882551.25642: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882551.25689: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882551.25704: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882551.25717: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882551.25733: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882551.25745: 
stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882551.25757: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882551.25771: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882551.25790: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882551.25805: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882551.25816: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882551.25825: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882551.25837: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882551.25920: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882551.25940: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882551.25954: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882551.26078: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882551.27882: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882551.27885: stdout chunk (state=3): >>><<< 19110 1726882551.27888: stderr chunk (state=3): >>><<< 19110 1726882551.27984: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 1726882551.27988: _low_level_execute_command(): starting 19110 1726882551.27991: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882551.161431-19422-197955881722062/AnsiballZ_command.py && sleep 0' 19110 1726882551.28646: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882551.28682: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882551.28697: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882551.28716: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882551.28758: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882551.28778: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882551.28807: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882551.28826: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882551.28838: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882551.28849: stderr 
chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882551.28862: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882551.28880: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882551.28913: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882551.28926: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882551.28938: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882551.28952: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882551.29049: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882551.29074: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882551.29090: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882551.29247: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882551.42905: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "lsr27", "up"], "start": "2024-09-20 21:35:51.420786", "end": "2024-09-20 21:35:51.426804", "delta": "0:00:00.006018", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set lsr27 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 19110 1726882551.43970: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
<<< 19110 1726882551.44019: stderr chunk (state=3): >>><<< 19110 1726882551.44022: stdout chunk (state=3): >>><<< 19110 1726882551.44036: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "lsr27", "up"], "start": "2024-09-20 21:35:51.420786", "end": "2024-09-20 21:35:51.426804", "delta": "0:00:00.006018", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set lsr27 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
19110 1726882551.44059: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link set lsr27 up', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882551.161431-19422-197955881722062/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 19110 1726882551.44068: _low_level_execute_command(): starting 19110 1726882551.44070: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882551.161431-19422-197955881722062/ > /dev/null 2>&1 && sleep 0' 19110 1726882551.44488: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882551.44492: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882551.44522: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882551.44525: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882551.44527: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882551.44582: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882551.44585: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882551.44686: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882551.46478: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882551.46522: stderr chunk (state=3): >>><<< 19110 1726882551.46525: stdout chunk (state=3): >>><<< 19110 1726882551.46536: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from 
master 0 19110 1726882551.46541: handler run complete 19110 1726882551.46557: Evaluated conditional (False): False 19110 1726882551.46565: attempt loop complete, returning result 19110 1726882551.46582: variable 'item' from source: unknown 19110 1726882551.46640: variable 'item' from source: unknown ok: [managed_node1] => (item=ip link set lsr27 up) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "set", "lsr27", "up" ], "delta": "0:00:00.006018", "end": "2024-09-20 21:35:51.426804", "item": "ip link set lsr27 up", "rc": 0, "start": "2024-09-20 21:35:51.420786" } 19110 1726882551.46758: dumping result to json 19110 1726882551.46761: done dumping result, returning 19110 1726882551.46763: done running TaskExecutor() for managed_node1/TASK: Create veth interface lsr27 [0e448fcc-3ce9-5372-c19a-000000000135] 19110 1726882551.46766: sending task result for task 0e448fcc-3ce9-5372-c19a-000000000135 19110 1726882551.46808: done sending task result for task 0e448fcc-3ce9-5372-c19a-000000000135 19110 1726882551.46811: WORKER PROCESS EXITING 19110 1726882551.46875: no more pending results, returning what we have 19110 1726882551.46878: results queue empty 19110 1726882551.46879: checking for any_errors_fatal 19110 1726882551.46884: done checking for any_errors_fatal 19110 1726882551.46884: checking for max_fail_percentage 19110 1726882551.46886: done checking for max_fail_percentage 19110 1726882551.46887: checking to see if all hosts have failed and the running result is not ok 19110 1726882551.46887: done checking to see if all hosts have failed 19110 1726882551.46888: getting the remaining hosts for this loop 19110 1726882551.46890: done getting the remaining hosts for this loop 19110 1726882551.46893: getting the next task for host managed_node1 19110 1726882551.46898: done getting next task for host managed_node1 19110 1726882551.46901: ^ task is: TASK: Set up veth as managed by NetworkManager 19110 1726882551.46903: ^ state is: HOST STATE: 
block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19110 1726882551.46907: getting variables 19110 1726882551.46908: in VariableManager get_vars() 19110 1726882551.46935: Calling all_inventory to load vars for managed_node1 19110 1726882551.46938: Calling groups_inventory to load vars for managed_node1 19110 1726882551.46941: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882551.46950: Calling all_plugins_play to load vars for managed_node1 19110 1726882551.46952: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882551.46957: Calling groups_plugins_play to load vars for managed_node1 19110 1726882551.47102: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882551.47218: done with get_vars() 19110 1726882551.47226: done getting variables 19110 1726882551.47272: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set up veth as managed by NetworkManager] ******************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:35 Friday 20 September 2024 21:35:51 -0400 (0:00:01.092) 0:00:08.330 ****** 19110 
1726882551.47292: entering _queue_task() for managed_node1/command 19110 1726882551.47476: worker is 1 (out of 1 available) 19110 1726882551.47489: exiting _queue_task() for managed_node1/command 19110 1726882551.47500: done queuing things up, now waiting for results queue to drain 19110 1726882551.47501: waiting for pending results... 19110 1726882551.47647: running TaskExecutor() for managed_node1/TASK: Set up veth as managed by NetworkManager 19110 1726882551.47710: in run() - task 0e448fcc-3ce9-5372-c19a-000000000136 19110 1726882551.47722: variable 'ansible_search_path' from source: unknown 19110 1726882551.47725: variable 'ansible_search_path' from source: unknown 19110 1726882551.47751: calling self._execute() 19110 1726882551.47808: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882551.47811: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882551.47819: variable 'omit' from source: magic vars 19110 1726882551.48074: variable 'ansible_distribution_major_version' from source: facts 19110 1726882551.48084: Evaluated conditional (ansible_distribution_major_version != '6'): True 19110 1726882551.48186: variable 'type' from source: set_fact 19110 1726882551.48190: variable 'state' from source: include params 19110 1726882551.48195: Evaluated conditional (type == 'veth' and state == 'present'): True 19110 1726882551.48207: variable 'omit' from source: magic vars 19110 1726882551.48226: variable 'omit' from source: magic vars 19110 1726882551.48293: variable 'interface' from source: set_fact 19110 1726882551.48305: variable 'omit' from source: magic vars 19110 1726882551.48338: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19110 1726882551.48365: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19110 1726882551.48381: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19110 1726882551.48395: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19110 1726882551.48404: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19110 1726882551.48429: variable 'inventory_hostname' from source: host vars for 'managed_node1' 19110 1726882551.48432: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882551.48434: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882551.48503: Set connection var ansible_timeout to 10 19110 1726882551.48512: Set connection var ansible_module_compression to ZIP_DEFLATED 19110 1726882551.48517: Set connection var ansible_shell_executable to /bin/sh 19110 1726882551.48520: Set connection var ansible_shell_type to sh 19110 1726882551.48523: Set connection var ansible_connection to ssh 19110 1726882551.48532: Set connection var ansible_pipelining to False 19110 1726882551.48545: variable 'ansible_shell_executable' from source: unknown 19110 1726882551.48547: variable 'ansible_connection' from source: unknown 19110 1726882551.48550: variable 'ansible_module_compression' from source: unknown 19110 1726882551.48552: variable 'ansible_shell_type' from source: unknown 19110 1726882551.48557: variable 'ansible_shell_executable' from source: unknown 19110 1726882551.48559: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882551.48562: variable 'ansible_pipelining' from source: unknown 19110 1726882551.48565: variable 'ansible_timeout' from source: unknown 19110 1726882551.48568: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882551.48665: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 19110 1726882551.48673: variable 'omit' from source: magic vars 19110 1726882551.48678: starting attempt loop 19110 1726882551.48681: running the handler 19110 1726882551.48693: _low_level_execute_command(): starting 19110 1726882551.48699: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 19110 1726882551.49195: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882551.49210: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882551.49223: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found <<< 19110 1726882551.49236: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882551.49245: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882551.49298: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882551.49316: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882551.49408: stderr chunk (state=3): >>>debug1: mux_client_request_session: 
master session id: 2 <<< 19110 1726882551.50966: stdout chunk (state=3): >>>/root <<< 19110 1726882551.51076: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882551.51114: stderr chunk (state=3): >>><<< 19110 1726882551.51117: stdout chunk (state=3): >>><<< 19110 1726882551.51134: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 1726882551.51144: _low_level_execute_command(): starting 19110 1726882551.51149: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882551.511329-19487-15955550438319 `" && echo ansible-tmp-1726882551.511329-19487-15955550438319="` echo /root/.ansible/tmp/ansible-tmp-1726882551.511329-19487-15955550438319 `" ) && sleep 0' 19110 1726882551.51562: stderr chunk (state=2): 
>>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882551.51578: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882551.51599: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882551.51611: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882551.51663: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882551.51681: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882551.51770: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882551.53609: stdout chunk (state=3): >>>ansible-tmp-1726882551.511329-19487-15955550438319=/root/.ansible/tmp/ansible-tmp-1726882551.511329-19487-15955550438319 <<< 19110 1726882551.53716: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882551.53758: stderr chunk (state=3): >>><<< 19110 1726882551.53769: stdout chunk (state=3): >>><<< 19110 1726882551.53785: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726882551.511329-19487-15955550438319=/root/.ansible/tmp/ansible-tmp-1726882551.511329-19487-15955550438319 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 1726882551.53810: variable 'ansible_module_compression' from source: unknown 19110 1726882551.53848: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-191108pnkimox/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 19110 1726882551.53880: variable 'ansible_facts' from source: unknown 19110 1726882551.53941: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882551.511329-19487-15955550438319/AnsiballZ_command.py 19110 1726882551.54047: Sending initial data 19110 1726882551.54057: Sent initial data (154 bytes) 19110 1726882551.54717: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882551.54720: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882551.54759: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882551.54763: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882551.54765: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882551.54819: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882551.54822: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882551.54826: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882551.54918: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882551.56629: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension 
"expand-path@openssh.com" revision 1 <<< 19110 1726882551.56740: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 19110 1726882551.56941: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-191108pnkimox/tmpcs2dobha /root/.ansible/tmp/ansible-tmp-1726882551.511329-19487-15955550438319/AnsiballZ_command.py <<< 19110 1726882551.56944: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 19110 1726882551.58232: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882551.58369: stderr chunk (state=3): >>><<< 19110 1726882551.58372: stdout chunk (state=3): >>><<< 19110 1726882551.58375: done transferring module to remote 19110 1726882551.58377: _low_level_execute_command(): starting 19110 1726882551.58379: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882551.511329-19487-15955550438319/ /root/.ansible/tmp/ansible-tmp-1726882551.511329-19487-15955550438319/AnsiballZ_command.py && sleep 0' 19110 1726882551.58794: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882551.58798: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882551.58836: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882551.58839: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 19110 1726882551.58841: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882551.58889: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882551.58892: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882551.58990: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882551.60821: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882551.60940: stderr chunk (state=3): >>><<< 19110 1726882551.60946: stdout chunk (state=3): >>><<< 19110 1726882551.60971: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 
debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 1726882551.60974: _low_level_execute_command(): starting 19110 1726882551.60977: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882551.511329-19487-15955550438319/AnsiballZ_command.py && sleep 0' 19110 1726882551.62557: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882551.62565: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882551.63280: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882551.63294: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882551.63331: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882551.63339: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882551.63349: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882551.63362: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882551.63372: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882551.63378: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882551.63386: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882551.63395: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882551.63407: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882551.63414: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882551.63421: stderr chunk (state=3): >>>debug2: 
match found <<< 19110 1726882551.63431: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882551.63505: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882551.63522: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882551.63534: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882551.63659: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882551.78588: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "d", "set", "lsr27", "managed", "true"], "start": "2024-09-20 21:35:51.764874", "end": "2024-09-20 21:35:51.784378", "delta": "0:00:00.019504", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli d set lsr27 managed true", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 19110 1726882551.79789: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
<<< 19110 1726882551.79881: stderr chunk (state=3): >>><<< 19110 1726882551.79885: stdout chunk (state=3): >>><<< 19110 1726882551.80028: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "d", "set", "lsr27", "managed", "true"], "start": "2024-09-20 21:35:51.764874", "end": "2024-09-20 21:35:51.784378", "delta": "0:00:00.019504", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli d set lsr27 managed true", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
19110 1726882551.80033: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli d set lsr27 managed true', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882551.511329-19487-15955550438319/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 19110 1726882551.80036: _low_level_execute_command(): starting 19110 1726882551.80038: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882551.511329-19487-15955550438319/ > /dev/null 2>&1 && sleep 0' 19110 1726882551.81432: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882551.81481: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882551.81496: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882551.81515: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882551.81588: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882551.81671: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882551.81684: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config <<< 19110 1726882551.81686: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882551.81792: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882551.81795: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882551.81906: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882551.83705: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882551.83786: stderr chunk (state=3): >>><<< 19110 1726882551.83789: stdout chunk (state=3): >>><<< 19110 1726882551.83972: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master 
debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 1726882551.83976: handler run complete 19110 1726882551.83979: Evaluated conditional (False): False 19110 1726882551.83981: attempt loop complete, returning result 19110 1726882551.83983: _execute() done 19110 1726882551.83985: dumping result to json 19110 1726882551.83987: done dumping result, returning 19110 1726882551.83988: done running TaskExecutor() for managed_node1/TASK: Set up veth as managed by NetworkManager [0e448fcc-3ce9-5372-c19a-000000000136] 19110 1726882551.83990: sending task result for task 0e448fcc-3ce9-5372-c19a-000000000136 ok: [managed_node1] => { "changed": false, "cmd": [ "nmcli", "d", "set", "lsr27", "managed", "true" ], "delta": "0:00:00.019504", "end": "2024-09-20 21:35:51.784378", "rc": 0, "start": "2024-09-20 21:35:51.764874" } 19110 1726882551.84130: no more pending results, returning what we have 19110 1726882551.84133: results queue empty 19110 1726882551.84134: checking for any_errors_fatal 19110 1726882551.84144: done checking for any_errors_fatal 19110 1726882551.84145: checking for max_fail_percentage 19110 1726882551.84146: done checking for max_fail_percentage 19110 1726882551.84147: checking to see if all hosts have failed and the running result is not ok 19110 1726882551.84148: done checking to see if all hosts have failed 19110 1726882551.84149: getting the remaining hosts for this loop 19110 1726882551.84151: done getting the remaining hosts for this loop 19110 1726882551.84157: getting the next task for host managed_node1 19110 1726882551.84163: done getting next task for host managed_node1 19110 1726882551.84166: ^ task is: TASK: Delete veth interface {{ interface }} 19110 1726882551.84170: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, 
update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19110 1726882551.84174: getting variables 19110 1726882551.84175: in VariableManager get_vars() 19110 1726882551.84204: Calling all_inventory to load vars for managed_node1 19110 1726882551.84206: Calling groups_inventory to load vars for managed_node1 19110 1726882551.84209: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882551.84220: Calling all_plugins_play to load vars for managed_node1 19110 1726882551.84222: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882551.84225: Calling groups_plugins_play to load vars for managed_node1 19110 1726882551.84383: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882551.84585: done with get_vars() 19110 1726882551.84596: done getting variables 19110 1726882551.84628: done sending task result for task 0e448fcc-3ce9-5372-c19a-000000000136 19110 1726882551.84631: WORKER PROCESS EXITING 19110 1726882551.84669: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 19110 1726882551.84789: variable 'interface' from source: set_fact TASK [Delete veth interface lsr27] ********************************************* task path: 
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:43
Friday 20 September 2024 21:35:51 -0400 (0:00:00.375) 0:00:08.705 ******
19110 1726882551.84819: entering _queue_task() for managed_node1/command
19110 1726882551.85426: worker is 1 (out of 1 available)
19110 1726882551.85444: exiting _queue_task() for managed_node1/command
19110 1726882551.85456: done queuing things up, now waiting for results queue to drain
19110 1726882551.85457: waiting for pending results...
19110 1726882551.85709: running TaskExecutor() for managed_node1/TASK: Delete veth interface lsr27
19110 1726882551.85805: in run() - task 0e448fcc-3ce9-5372-c19a-000000000137
19110 1726882551.85822: variable 'ansible_search_path' from source: unknown
19110 1726882551.85828: variable 'ansible_search_path' from source: unknown
19110 1726882551.85872: calling self._execute()
19110 1726882551.85949: variable 'ansible_host' from source: host vars for 'managed_node1'
19110 1726882551.85960: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
19110 1726882551.85977: variable 'omit' from source: magic vars
19110 1726882551.86356: variable 'ansible_distribution_major_version' from source: facts
19110 1726882551.86380: Evaluated conditional (ansible_distribution_major_version != '6'): True
19110 1726882551.86598: variable 'type' from source: set_fact
19110 1726882551.86608: variable 'state' from source: include params
19110 1726882551.86616: variable 'interface' from source: set_fact
19110 1726882551.86624: variable 'current_interfaces' from source: set_fact
19110 1726882551.86641: Evaluated conditional (type == 'veth' and state == 'absent' and interface in current_interfaces): False
19110 1726882551.86649: when evaluation is False, skipping this task
19110 1726882551.86659: _execute() done
19110 1726882551.86669: dumping result to json
19110 1726882551.86677: done dumping result, returning
19110 1726882551.86687: done running TaskExecutor() for managed_node1/TASK: Delete veth interface lsr27 [0e448fcc-3ce9-5372-c19a-000000000137]
19110 1726882551.86697: sending task result for task 0e448fcc-3ce9-5372-c19a-000000000137
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "type == 'veth' and state == 'absent' and interface in current_interfaces",
    "skip_reason": "Conditional result was False"
}
19110 1726882551.86839: no more pending results, returning what we have
19110 1726882551.86843: results queue empty
19110 1726882551.86844: checking for any_errors_fatal
19110 1726882551.86852: done checking for any_errors_fatal
19110 1726882551.86853: checking for max_fail_percentage
19110 1726882551.86855: done checking for max_fail_percentage
19110 1726882551.86856: checking to see if all hosts have failed and the running result is not ok
19110 1726882551.86857: done checking to see if all hosts have failed
19110 1726882551.86858: getting the remaining hosts for this loop
19110 1726882551.86859: done getting the remaining hosts for this loop
19110 1726882551.86865: getting the next task for host managed_node1
19110 1726882551.86872: done getting next task for host managed_node1
19110 1726882551.86875: ^ task is: TASK: Create dummy interface {{ interface }}
19110 1726882551.86879: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
19110 1726882551.86883: getting variables
19110 1726882551.86885: in VariableManager get_vars()
19110 1726882551.86918: Calling all_inventory to load vars for managed_node1
19110 1726882551.86922: Calling groups_inventory to load vars for managed_node1
19110 1726882551.86925: Calling all_plugins_inventory to load vars for managed_node1
19110 1726882551.86940: Calling all_plugins_play to load vars for managed_node1
19110 1726882551.86943: Calling groups_plugins_inventory to load vars for managed_node1
19110 1726882551.86946: Calling groups_plugins_play to load vars for managed_node1
19110 1726882551.87207: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
19110 1726882551.87887: done with get_vars()
19110 1726882551.87897: done getting variables
19110 1726882551.88082: done sending task result for task 0e448fcc-3ce9-5372-c19a-000000000137
19110 1726882551.88085: WORKER PROCESS EXITING
19110 1726882551.88125: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
19110 1726882551.88347: variable 'interface' from source: set_fact

TASK [Create dummy interface lsr27] ********************************************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:49
Friday 20 September 2024 21:35:51 -0400 (0:00:00.035) 0:00:08.741 ******
19110 1726882551.88379: entering _queue_task() for managed_node1/command
19110 1726882551.88634: worker is 1 (out of 1 available)
19110 1726882551.88645: exiting _queue_task() for managed_node1/command
19110 1726882551.88657: done queuing things up, now waiting for results queue to drain
19110 1726882551.88658: waiting for pending results...
19110 1726882551.88913: running TaskExecutor() for managed_node1/TASK: Create dummy interface lsr27
19110 1726882551.89015: in run() - task 0e448fcc-3ce9-5372-c19a-000000000138
19110 1726882551.89033: variable 'ansible_search_path' from source: unknown
19110 1726882551.89039: variable 'ansible_search_path' from source: unknown
19110 1726882551.89084: calling self._execute()
19110 1726882551.89164: variable 'ansible_host' from source: host vars for 'managed_node1'
19110 1726882551.89181: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
19110 1726882551.89196: variable 'omit' from source: magic vars
19110 1726882551.89691: variable 'ansible_distribution_major_version' from source: facts
19110 1726882551.89742: Evaluated conditional (ansible_distribution_major_version != '6'): True
19110 1726882551.89975: variable 'type' from source: set_fact
19110 1726882551.89986: variable 'state' from source: include params
19110 1726882551.89995: variable 'interface' from source: set_fact
19110 1726882551.90003: variable 'current_interfaces' from source: set_fact
19110 1726882551.90015: Evaluated conditional (type == 'dummy' and state == 'present' and interface not in current_interfaces): False
19110 1726882551.90022: when evaluation is False, skipping this task
19110 1726882551.90029: _execute() done
19110 1726882551.90034: dumping result to json
19110 1726882551.90042: done dumping result, returning
19110 1726882551.90052: done running TaskExecutor() for managed_node1/TASK: Create dummy interface lsr27 [0e448fcc-3ce9-5372-c19a-000000000138]
19110 1726882551.90070: sending task result for task 0e448fcc-3ce9-5372-c19a-000000000138
19110 1726882551.90177: done sending task result for task 0e448fcc-3ce9-5372-c19a-000000000138
19110 1726882551.90187: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "type == 'dummy' and state == 'present' and interface not in current_interfaces",
    "skip_reason": "Conditional result was False"
}
19110 1726882551.90237: no more pending results, returning what we have
19110 1726882551.90241: results queue empty
19110 1726882551.90242: checking for any_errors_fatal
19110 1726882551.90247: done checking for any_errors_fatal
19110 1726882551.90248: checking for max_fail_percentage
19110 1726882551.90250: done checking for max_fail_percentage
19110 1726882551.90251: checking to see if all hosts have failed and the running result is not ok
19110 1726882551.90252: done checking to see if all hosts have failed
19110 1726882551.90253: getting the remaining hosts for this loop
19110 1726882551.90254: done getting the remaining hosts for this loop
19110 1726882551.90258: getting the next task for host managed_node1
19110 1726882551.90267: done getting next task for host managed_node1
19110 1726882551.90270: ^ task is: TASK: Delete dummy interface {{ interface }}
19110 1726882551.90273: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
19110 1726882551.90279: getting variables
19110 1726882551.90280: in VariableManager get_vars()
19110 1726882551.90309: Calling all_inventory to load vars for managed_node1
19110 1726882551.90312: Calling groups_inventory to load vars for managed_node1
19110 1726882551.90315: Calling all_plugins_inventory to load vars for managed_node1
19110 1726882551.90328: Calling all_plugins_play to load vars for managed_node1
19110 1726882551.90331: Calling groups_plugins_inventory to load vars for managed_node1
19110 1726882551.90334: Calling groups_plugins_play to load vars for managed_node1
19110 1726882551.90522: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
19110 1726882551.90724: done with get_vars()
19110 1726882551.90736: done getting variables
19110 1726882551.90812: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
19110 1726882551.90927: variable 'interface' from source: set_fact

TASK [Delete dummy interface lsr27] ********************************************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:54
Friday 20 September 2024 21:35:51 -0400 (0:00:00.025) 0:00:08.766 ******
19110 1726882551.90955: entering _queue_task() for managed_node1/command
19110 1726882551.91141: worker is 1 (out of 1 available)
19110 1726882551.91154: exiting _queue_task() for managed_node1/command
19110 1726882551.91167: done queuing things up, now waiting for results queue to drain
19110 1726882551.91169: waiting for pending results...
19110 1726882551.91311: running TaskExecutor() for managed_node1/TASK: Delete dummy interface lsr27
19110 1726882551.91398: in run() - task 0e448fcc-3ce9-5372-c19a-000000000139
19110 1726882551.91421: variable 'ansible_search_path' from source: unknown
19110 1726882551.91442: variable 'ansible_search_path' from source: unknown
19110 1726882551.91511: calling self._execute()
19110 1726882551.91627: variable 'ansible_host' from source: host vars for 'managed_node1'
19110 1726882551.91675: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
19110 1726882551.91692: variable 'omit' from source: magic vars
19110 1726882551.92893: variable 'ansible_distribution_major_version' from source: facts
19110 1726882551.92924: Evaluated conditional (ansible_distribution_major_version != '6'): True
19110 1726882551.93186: variable 'type' from source: set_fact
19110 1726882551.93195: variable 'state' from source: include params
19110 1726882551.93202: variable 'interface' from source: set_fact
19110 1726882551.93209: variable 'current_interfaces' from source: set_fact
19110 1726882551.93220: Evaluated conditional (type == 'dummy' and state == 'absent' and interface in current_interfaces): False
19110 1726882551.93225: when evaluation is False, skipping this task
19110 1726882551.93237: _execute() done
19110 1726882551.93244: dumping result to json
19110 1726882551.93250: done dumping result, returning
19110 1726882551.93258: done running TaskExecutor() for managed_node1/TASK: Delete dummy interface lsr27 [0e448fcc-3ce9-5372-c19a-000000000139]
19110 1726882551.93271: sending task result for task 0e448fcc-3ce9-5372-c19a-000000000139
19110 1726882551.93499: done sending task result for task 0e448fcc-3ce9-5372-c19a-000000000139
19110 1726882551.93507: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "type == 'dummy' and state == 'absent' and interface in current_interfaces",
    "skip_reason": "Conditional result was False"
}
19110 1726882551.93574: no more pending results, returning what we have
19110 1726882551.93578: results queue empty
19110 1726882551.93579: checking for any_errors_fatal
19110 1726882551.93585: done checking for any_errors_fatal
19110 1726882551.93586: checking for max_fail_percentage
19110 1726882551.93588: done checking for max_fail_percentage
19110 1726882551.93589: checking to see if all hosts have failed and the running result is not ok
19110 1726882551.93590: done checking to see if all hosts have failed
19110 1726882551.93590: getting the remaining hosts for this loop
19110 1726882551.93592: done getting the remaining hosts for this loop
19110 1726882551.93596: getting the next task for host managed_node1
19110 1726882551.93601: done getting next task for host managed_node1
19110 1726882551.93604: ^ task is: TASK: Create tap interface {{ interface }}
19110 1726882551.93608: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
19110 1726882551.93612: getting variables
19110 1726882551.93614: in VariableManager get_vars()
19110 1726882551.93644: Calling all_inventory to load vars for managed_node1
19110 1726882551.93646: Calling groups_inventory to load vars for managed_node1
19110 1726882551.93650: Calling all_plugins_inventory to load vars for managed_node1
19110 1726882551.93662: Calling all_plugins_play to load vars for managed_node1
19110 1726882551.93667: Calling groups_plugins_inventory to load vars for managed_node1
19110 1726882551.93670: Calling groups_plugins_play to load vars for managed_node1
19110 1726882551.93901: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
19110 1726882551.94072: done with get_vars()
19110 1726882551.94085: done getting variables
19110 1726882551.94150: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
19110 1726882551.94415: variable 'interface' from source: set_fact

TASK [Create tap interface lsr27] **********************************************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:60
Friday 20 September 2024 21:35:51 -0400 (0:00:00.034) 0:00:08.801 ******
19110 1726882551.94448: entering _queue_task() for managed_node1/command
19110 1726882551.94760: worker is 1 (out of 1 available)
19110 1726882551.94773: exiting _queue_task() for managed_node1/command
19110 1726882551.94784: done queuing things up, now waiting for results queue to drain
19110 1726882551.94785: waiting for pending results...
19110 1726882551.95012: running TaskExecutor() for managed_node1/TASK: Create tap interface lsr27
19110 1726882551.95118: in run() - task 0e448fcc-3ce9-5372-c19a-00000000013a
19110 1726882551.95143: variable 'ansible_search_path' from source: unknown
19110 1726882551.95152: variable 'ansible_search_path' from source: unknown
19110 1726882551.95197: calling self._execute()
19110 1726882551.95275: variable 'ansible_host' from source: host vars for 'managed_node1'
19110 1726882551.95292: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
19110 1726882551.95307: variable 'omit' from source: magic vars
19110 1726882551.95670: variable 'ansible_distribution_major_version' from source: facts
19110 1726882551.95689: Evaluated conditional (ansible_distribution_major_version != '6'): True
19110 1726882551.96056: variable 'type' from source: set_fact
19110 1726882551.96076: variable 'state' from source: include params
19110 1726882551.96086: variable 'interface' from source: set_fact
19110 1726882551.96094: variable 'current_interfaces' from source: set_fact
19110 1726882551.96118: Evaluated conditional (type == 'tap' and state == 'present' and interface not in current_interfaces): False
19110 1726882551.96127: when evaluation is False, skipping this task
19110 1726882551.96134: _execute() done
19110 1726882551.96140: dumping result to json
19110 1726882551.96148: done dumping result, returning
19110 1726882551.96158: done running TaskExecutor() for managed_node1/TASK: Create tap interface lsr27 [0e448fcc-3ce9-5372-c19a-00000000013a]
19110 1726882551.96175: sending task result for task 0e448fcc-3ce9-5372-c19a-00000000013a
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "type == 'tap' and state == 'present' and interface not in current_interfaces",
    "skip_reason": "Conditional result was False"
}
19110 1726882551.96308: no more pending results, returning what we have
19110 1726882551.96312: results queue empty
19110 1726882551.96314: checking for any_errors_fatal
19110 1726882551.96319: done checking for any_errors_fatal
19110 1726882551.96320: checking for max_fail_percentage
19110 1726882551.96322: done checking for max_fail_percentage
19110 1726882551.96323: checking to see if all hosts have failed and the running result is not ok
19110 1726882551.96324: done checking to see if all hosts have failed
19110 1726882551.96324: getting the remaining hosts for this loop
19110 1726882551.96326: done getting the remaining hosts for this loop
19110 1726882551.96329: getting the next task for host managed_node1
19110 1726882551.96336: done getting next task for host managed_node1
19110 1726882551.96339: ^ task is: TASK: Delete tap interface {{ interface }}
19110 1726882551.96343: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
19110 1726882551.96346: getting variables
19110 1726882551.96348: in VariableManager get_vars()
19110 1726882551.96379: Calling all_inventory to load vars for managed_node1
19110 1726882551.96382: Calling groups_inventory to load vars for managed_node1
19110 1726882551.96388: Calling all_plugins_inventory to load vars for managed_node1
19110 1726882551.96401: Calling all_plugins_play to load vars for managed_node1
19110 1726882551.96404: Calling groups_plugins_inventory to load vars for managed_node1
19110 1726882551.96407: Calling groups_plugins_play to load vars for managed_node1
19110 1726882551.96568: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
19110 1726882551.96765: done with get_vars()
19110 1726882551.96774: done getting variables
19110 1726882551.96808: done sending task result for task 0e448fcc-3ce9-5372-c19a-00000000013a
19110 1726882551.96811: WORKER PROCESS EXITING
19110 1726882551.96845: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
19110 1726882551.96954: variable 'interface' from source: set_fact

TASK [Delete tap interface lsr27] **********************************************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:65
Friday 20 September 2024 21:35:51 -0400 (0:00:00.025) 0:00:08.827 ******
19110 1726882551.96984: entering _queue_task() for managed_node1/command
19110 1726882551.97217: worker is 1 (out of 1 available)
19110 1726882551.97235: exiting _queue_task() for managed_node1/command
19110 1726882551.97248: done queuing things up, now waiting for results queue to drain
19110 1726882551.97250: waiting for pending results...
19110 1726882551.97512: running TaskExecutor() for managed_node1/TASK: Delete tap interface lsr27
19110 1726882551.97619: in run() - task 0e448fcc-3ce9-5372-c19a-00000000013b
19110 1726882551.97639: variable 'ansible_search_path' from source: unknown
19110 1726882551.97646: variable 'ansible_search_path' from source: unknown
19110 1726882551.97721: calling self._execute()
19110 1726882551.97804: variable 'ansible_host' from source: host vars for 'managed_node1'
19110 1726882551.97814: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
19110 1726882551.97827: variable 'omit' from source: magic vars
19110 1726882551.98188: variable 'ansible_distribution_major_version' from source: facts
19110 1726882551.98206: Evaluated conditional (ansible_distribution_major_version != '6'): True
19110 1726882551.98422: variable 'type' from source: set_fact
19110 1726882551.98432: variable 'state' from source: include params
19110 1726882551.98443: variable 'interface' from source: set_fact
19110 1726882551.98458: variable 'current_interfaces' from source: set_fact
19110 1726882551.98472: Evaluated conditional (type == 'tap' and state == 'absent' and interface in current_interfaces): False
19110 1726882551.98479: when evaluation is False, skipping this task
19110 1726882551.98487: _execute() done
19110 1726882551.98494: dumping result to json
19110 1726882551.98502: done dumping result, returning
19110 1726882551.98512: done running TaskExecutor() for managed_node1/TASK: Delete tap interface lsr27 [0e448fcc-3ce9-5372-c19a-00000000013b]
19110 1726882551.98523: sending task result for task 0e448fcc-3ce9-5372-c19a-00000000013b
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "type == 'tap' and state == 'absent' and interface in current_interfaces",
    "skip_reason": "Conditional result was False"
}
19110 1726882551.98659: no more pending results, returning what we have
19110 1726882551.98670: results queue empty
19110 1726882551.98671: checking for any_errors_fatal
19110 1726882551.98677: done checking for any_errors_fatal
19110 1726882551.98678: checking for max_fail_percentage
19110 1726882551.98680: done checking for max_fail_percentage
19110 1726882551.98681: checking to see if all hosts have failed and the running result is not ok
19110 1726882551.98682: done checking to see if all hosts have failed
19110 1726882551.98682: getting the remaining hosts for this loop
19110 1726882551.98684: done getting the remaining hosts for this loop
19110 1726882551.98688: getting the next task for host managed_node1
19110 1726882551.98697: done getting next task for host managed_node1
19110 1726882551.98700: ^ task is: TASK: Include the task 'assert_device_present.yml'
19110 1726882551.98704: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
19110 1726882551.98709: getting variables
19110 1726882551.98710: in VariableManager get_vars()
19110 1726882551.98741: Calling all_inventory to load vars for managed_node1
19110 1726882551.98745: Calling groups_inventory to load vars for managed_node1
19110 1726882551.98749: Calling all_plugins_inventory to load vars for managed_node1
19110 1726882551.98766: Calling all_plugins_play to load vars for managed_node1
19110 1726882551.98769: Calling groups_plugins_inventory to load vars for managed_node1
19110 1726882551.98772: Calling groups_plugins_play to load vars for managed_node1
19110 1726882551.99241: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
19110 1726882551.99700: done with get_vars()
19110 1726882551.99708: done getting variables
19110 1726882551.99742: done sending task result for task 0e448fcc-3ce9-5372-c19a-00000000013b
19110 1726882551.99745: WORKER PROCESS EXITING

TASK [Include the task 'assert_device_present.yml'] ****************************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:30
Friday 20 September 2024 21:35:51 -0400 (0:00:00.028) 0:00:08.855 ******
19110 1726882551.99811: entering _queue_task() for managed_node1/include_tasks
19110 1726882552.00044: worker is 1 (out of 1 available)
19110 1726882552.00067: exiting _queue_task() for managed_node1/include_tasks
19110 1726882552.00084: done queuing things up, now waiting for results queue to drain
19110 1726882552.00086: waiting for pending results...
19110 1726882552.00375: running TaskExecutor() for managed_node1/TASK: Include the task 'assert_device_present.yml'
19110 1726882552.00471: in run() - task 0e448fcc-3ce9-5372-c19a-000000000012
19110 1726882552.00493: variable 'ansible_search_path' from source: unknown
19110 1726882552.00542: calling self._execute()
19110 1726882552.00626: variable 'ansible_host' from source: host vars for 'managed_node1'
19110 1726882552.00637: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
19110 1726882552.00651: variable 'omit' from source: magic vars
19110 1726882552.01028: variable 'ansible_distribution_major_version' from source: facts
19110 1726882552.01051: Evaluated conditional (ansible_distribution_major_version != '6'): True
19110 1726882552.01065: _execute() done
19110 1726882552.01075: dumping result to json
19110 1726882552.01083: done dumping result, returning
19110 1726882552.01095: done running TaskExecutor() for managed_node1/TASK: Include the task 'assert_device_present.yml' [0e448fcc-3ce9-5372-c19a-000000000012]
19110 1726882552.01109: sending task result for task 0e448fcc-3ce9-5372-c19a-000000000012
19110 1726882552.01245: no more pending results, returning what we have
19110 1726882552.01250: in VariableManager get_vars()
19110 1726882552.01284: Calling all_inventory to load vars for managed_node1
19110 1726882552.01287: Calling groups_inventory to load vars for managed_node1
19110 1726882552.01290: Calling all_plugins_inventory to load vars for managed_node1
19110 1726882552.01302: Calling all_plugins_play to load vars for managed_node1
19110 1726882552.01305: Calling groups_plugins_inventory to load vars for managed_node1
19110 1726882552.01308: Calling groups_plugins_play to load vars for managed_node1
19110 1726882552.01483: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
19110 1726882552.01669: done with get_vars()
19110 1726882552.01676: variable 'ansible_search_path' from source: unknown
19110 1726882552.01689: we have included files to process
19110 1726882552.01690: generating all_blocks data
19110 1726882552.01693: done generating all_blocks data
19110 1726882552.01697: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml
19110 1726882552.01698: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml
19110 1726882552.01700: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml
19110 1726882552.01867: in VariableManager get_vars()
19110 1726882552.01969: done with get_vars()
19110 1726882552.02116: done sending task result for task 0e448fcc-3ce9-5372-c19a-000000000012
19110 1726882552.02120: WORKER PROCESS EXITING
19110 1726882552.02229: done processing included file
19110 1726882552.02231: iterating over new_blocks loaded from include file
19110 1726882552.02232: in VariableManager get_vars()
19110 1726882552.02243: done with get_vars()
19110 1726882552.02245: filtering new block on tags
19110 1726882552.02266: done filtering new block on tags
19110 1726882552.02269: done iterating over new_blocks loaded from include file
included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml for managed_node1
19110 1726882552.02273: extending task lists for all hosts with included blocks
19110 1726882552.02854: done extending task lists
19110 1726882552.02856: done processing included files
19110 1726882552.02857: results queue empty
19110 1726882552.02857: checking for any_errors_fatal
19110 1726882552.02865: done checking for any_errors_fatal
19110 1726882552.02866: checking for max_fail_percentage
19110 1726882552.02867: done checking for max_fail_percentage
19110 1726882552.02868: checking to see if all hosts have failed and the running result is not ok
19110 1726882552.02868: done checking to see if all hosts have failed
19110 1726882552.02869: getting the remaining hosts for this loop
19110 1726882552.02871: done getting the remaining hosts for this loop
19110 1726882552.02874: getting the next task for host managed_node1
19110 1726882552.02877: done getting next task for host managed_node1
19110 1726882552.02879: ^ task is: TASK: Include the task 'get_interface_stat.yml'
19110 1726882552.02881: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
19110 1726882552.02883: getting variables
19110 1726882552.02884: in VariableManager get_vars()
19110 1726882552.02891: Calling all_inventory to load vars for managed_node1
19110 1726882552.02892: Calling groups_inventory to load vars for managed_node1
19110 1726882552.02894: Calling all_plugins_inventory to load vars for managed_node1
19110 1726882552.02899: Calling all_plugins_play to load vars for managed_node1
19110 1726882552.02901: Calling groups_plugins_inventory to load vars for managed_node1
19110 1726882552.02903: Calling groups_plugins_play to load vars for managed_node1
19110 1726882552.03025: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
19110 1726882552.03203: done with get_vars()
19110 1726882552.03210: done getting variables

TASK [Include the task 'get_interface_stat.yml'] *******************************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3
Friday 20 September 2024 21:35:52 -0400 (0:00:00.034) 0:00:08.889 ******
19110 1726882552.03272: entering _queue_task() for managed_node1/include_tasks
19110 1726882552.03489: worker is 1 (out of 1 available)
19110 1726882552.03500: exiting _queue_task() for managed_node1/include_tasks
19110 1726882552.03514: done queuing things up, now waiting for results queue to drain
19110 1726882552.03515: waiting for pending results...
19110 1726882552.03744: running TaskExecutor() for managed_node1/TASK: Include the task 'get_interface_stat.yml'
19110 1726882552.03827: in run() - task 0e448fcc-3ce9-5372-c19a-0000000001d3
19110 1726882552.03847: variable 'ansible_search_path' from source: unknown
19110 1726882552.03855: variable 'ansible_search_path' from source: unknown
19110 1726882552.03892: calling self._execute()
19110 1726882552.03970: variable 'ansible_host' from source: host vars for 'managed_node1'
19110 1726882552.03980: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
19110 1726882552.03991: variable 'omit' from source: magic vars
19110 1726882552.04336: variable 'ansible_distribution_major_version' from source: facts
19110 1726882552.04354: Evaluated conditional (ansible_distribution_major_version != '6'): True
19110 1726882552.04367: _execute() done
19110 1726882552.04376: dumping result to json
19110 1726882552.04387: done dumping result, returning
19110 1726882552.04396: done running TaskExecutor() for managed_node1/TASK: Include the task 'get_interface_stat.yml' [0e448fcc-3ce9-5372-c19a-0000000001d3]
19110 1726882552.04412: sending task result for task 0e448fcc-3ce9-5372-c19a-0000000001d3
19110 1726882552.04533: no more pending results, returning what we have
19110 1726882552.04538: in VariableManager get_vars()
19110 1726882552.04573: Calling all_inventory to load vars for managed_node1
19110 1726882552.04576: Calling groups_inventory to load vars for managed_node1
19110 1726882552.04580: Calling all_plugins_inventory to load vars for managed_node1
19110 1726882552.04592: Calling all_plugins_play to load vars for managed_node1
19110 1726882552.04595: Calling groups_plugins_inventory to load vars for managed_node1
19110 1726882552.04599: Calling groups_plugins_play to load vars for managed_node1
19110 1726882552.04785: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
19110 1726882552.05012: done with get_vars()
19110 1726882552.05019: variable 'ansible_search_path' from source: unknown
19110 1726882552.05020: variable 'ansible_search_path' from source: unknown
19110 1726882552.05057: we have included files to process
19110 1726882552.05058: generating all_blocks data
19110 1726882552.05060: done generating all_blocks data
19110 1726882552.05062: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml
19110 1726882552.05065: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml
19110 1726882552.05068: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml
19110 1726882552.05407: done sending task result for task 0e448fcc-3ce9-5372-c19a-0000000001d3
19110 1726882552.05412: WORKER PROCESS EXITING
19110 1726882552.05509: done processing included file
19110 1726882552.05511: iterating over new_blocks loaded from include file
19110 1726882552.05513: in VariableManager get_vars()
19110 1726882552.05531: done with get_vars()
19110 1726882552.05532: filtering new block on tags
19110 1726882552.05547: done filtering new block on tags
19110 1726882552.05549: done iterating over new_blocks loaded from include file
included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node1
19110 1726882552.05554: extending task lists for all hosts with included blocks
19110 1726882552.05659: done extending task lists
19110 1726882552.05660: done processing included files
19110 1726882552.05661: results queue empty
19110 1726882552.05662: checking for any_errors_fatal
19110 1726882552.05666: done checking for any_errors_fatal
19110 1726882552.05667: checking for max_fail_percentage
19110 1726882552.05668: done checking for max_fail_percentage
19110 1726882552.05669: checking to see if all hosts have failed and the running result is not ok
19110 1726882552.05670: done checking to see if all hosts have failed
19110 1726882552.05671: getting the remaining hosts for this loop
19110 1726882552.05672: done getting the remaining hosts for this loop
19110 1726882552.05675: getting the next task for host managed_node1
19110 1726882552.05678: done getting next task for host managed_node1
19110 1726882552.05680: ^ task is: TASK: Get stat for interface {{ interface }}
19110 1726882552.05683: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
19110 1726882552.05686: getting variables
19110 1726882552.05687: in VariableManager get_vars()
19110 1726882552.05694: Calling all_inventory to load vars for managed_node1
19110 1726882552.05696: Calling groups_inventory to load vars for managed_node1
19110 1726882552.05699: Calling all_plugins_inventory to load vars for managed_node1
19110 1726882552.05703: Calling all_plugins_play to load vars for managed_node1
19110 1726882552.05705: Calling groups_plugins_inventory to load vars for managed_node1
19110 1726882552.05708: Calling groups_plugins_play to load vars for managed_node1
19110 1726882552.05852: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
19110 1726882552.06045: done with get_vars()
19110 1726882552.06053: done getting variables
19110 1726882552.06207: variable 'interface' from source: set_fact

TASK [Get stat for interface lsr27] ********************************************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3
Friday 20 September 2024 21:35:52 -0400 (0:00:00.029) 0:00:08.919 ******
19110 1726882552.06235: entering _queue_task() for managed_node1/stat
19110 1726882552.06449: worker is 1 (out of 1 available)
19110 1726882552.06460: exiting _queue_task() for managed_node1/stat
19110 1726882552.06473: done queuing things up, now waiting for results queue to drain
19110 1726882552.06475: waiting for pending results...
19110 1726882552.06715: running TaskExecutor() for managed_node1/TASK: Get stat for interface lsr27 19110 1726882552.06831: in run() - task 0e448fcc-3ce9-5372-c19a-00000000021e 19110 1726882552.06854: variable 'ansible_search_path' from source: unknown 19110 1726882552.06866: variable 'ansible_search_path' from source: unknown 19110 1726882552.06904: calling self._execute() 19110 1726882552.06985: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882552.06996: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882552.07010: variable 'omit' from source: magic vars 19110 1726882552.07445: variable 'ansible_distribution_major_version' from source: facts 19110 1726882552.07468: Evaluated conditional (ansible_distribution_major_version != '6'): True 19110 1726882552.07481: variable 'omit' from source: magic vars 19110 1726882552.07529: variable 'omit' from source: magic vars 19110 1726882552.07628: variable 'interface' from source: set_fact 19110 1726882552.07648: variable 'omit' from source: magic vars 19110 1726882552.07698: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19110 1726882552.07741: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19110 1726882552.07767: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19110 1726882552.07793: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19110 1726882552.07808: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19110 1726882552.07846: variable 'inventory_hostname' from source: host vars for 'managed_node1' 19110 1726882552.07855: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882552.07863: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882552.07973: Set connection var ansible_timeout to 10 19110 1726882552.07991: Set connection var ansible_module_compression to ZIP_DEFLATED 19110 1726882552.08002: Set connection var ansible_shell_executable to /bin/sh 19110 1726882552.08012: Set connection var ansible_shell_type to sh 19110 1726882552.08018: Set connection var ansible_connection to ssh 19110 1726882552.08027: Set connection var ansible_pipelining to False 19110 1726882552.08055: variable 'ansible_shell_executable' from source: unknown 19110 1726882552.08066: variable 'ansible_connection' from source: unknown 19110 1726882552.08074: variable 'ansible_module_compression' from source: unknown 19110 1726882552.08081: variable 'ansible_shell_type' from source: unknown 19110 1726882552.08087: variable 'ansible_shell_executable' from source: unknown 19110 1726882552.08093: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882552.08101: variable 'ansible_pipelining' from source: unknown 19110 1726882552.08107: variable 'ansible_timeout' from source: unknown 19110 1726882552.08120: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882552.08325: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 19110 1726882552.08344: variable 'omit' from source: magic vars 19110 1726882552.08357: starting attempt loop 19110 1726882552.08369: running the handler 19110 1726882552.08387: _low_level_execute_command(): starting 19110 1726882552.08399: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 19110 1726882552.09182: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882552.09197: stderr chunk (state=3): >>>debug1: 
Reading configuration data /root/.ssh/config <<< 19110 1726882552.09217: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882552.09241: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882552.09288: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882552.09301: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882552.09321: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882552.09341: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882552.09358: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882552.09373: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882552.09387: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882552.09402: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882552.09419: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882552.09436: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882552.09449: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882552.09472: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882552.09552: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882552.09583: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882552.09601: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882552.09731: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 19110 1726882552.11371: stdout chunk (state=3): >>>/root <<< 19110 1726882552.11473: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882552.11562: stderr chunk (state=3): >>><<< 19110 1726882552.11577: stdout chunk (state=3): >>><<< 19110 1726882552.11703: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 1726882552.11706: _low_level_execute_command(): starting 19110 1726882552.11710: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882552.1160624-19548-221030287472070 `" && echo ansible-tmp-1726882552.1160624-19548-221030287472070="` echo /root/.ansible/tmp/ansible-tmp-1726882552.1160624-19548-221030287472070 `" ) && sleep 0' 19110 
1726882552.12317: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882552.12333: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882552.12354: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882552.12376: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882552.12440: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882552.12454: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882552.12476: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882552.12494: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882552.12506: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882552.12516: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882552.12527: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882552.12542: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882552.12556: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882552.12574: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882552.12585: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882552.12597: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882552.12676: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882552.12701: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882552.12717: stderr 
chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882552.12841: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882552.14705: stdout chunk (state=3): >>>ansible-tmp-1726882552.1160624-19548-221030287472070=/root/.ansible/tmp/ansible-tmp-1726882552.1160624-19548-221030287472070 <<< 19110 1726882552.14879: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882552.14912: stderr chunk (state=3): >>><<< 19110 1726882552.14915: stdout chunk (state=3): >>><<< 19110 1726882552.15193: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882552.1160624-19548-221030287472070=/root/.ansible/tmp/ansible-tmp-1726882552.1160624-19548-221030287472070 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 1726882552.15197: variable 'ansible_module_compression' from source: unknown 
19110 1726882552.15200: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-191108pnkimox/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 19110 1726882552.15202: variable 'ansible_facts' from source: unknown 19110 1726882552.15493: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882552.1160624-19548-221030287472070/AnsiballZ_stat.py 19110 1726882552.15650: Sending initial data 19110 1726882552.15653: Sent initial data (153 bytes) 19110 1726882552.16695: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882552.16718: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882552.16733: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882552.16768: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882552.16812: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882552.16832: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882552.16870: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882552.16897: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882552.16924: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882552.16934: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882552.16944: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882552.16960: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882552.16985: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882552.17004: stderr chunk (state=3): >>>debug2: checking match for 'final all' 
host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882552.17015: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882552.17029: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882552.17129: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882552.17149: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882552.17175: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882552.17301: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882552.19024: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 19110 1726882552.19113: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 19110 1726882552.19208: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-191108pnkimox/tmp83dgn4jp /root/.ansible/tmp/ansible-tmp-1726882552.1160624-19548-221030287472070/AnsiballZ_stat.py <<< 19110 1726882552.19298: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 19110 1726882552.20838: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882552.20943: stderr chunk (state=3): >>><<< 19110 1726882552.20946: stdout chunk (state=3): 
>>><<< 19110 1726882552.20948: done transferring module to remote 19110 1726882552.20951: _low_level_execute_command(): starting 19110 1726882552.20957: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882552.1160624-19548-221030287472070/ /root/.ansible/tmp/ansible-tmp-1726882552.1160624-19548-221030287472070/AnsiballZ_stat.py && sleep 0' 19110 1726882552.22159: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882552.22879: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882552.22895: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882552.22914: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882552.22956: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882552.22971: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882552.22987: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882552.23005: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882552.23018: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882552.23030: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882552.23043: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882552.23057: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882552.23077: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882552.23090: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882552.23102: stderr 
chunk (state=3): >>>debug2: match found <<< 19110 1726882552.23116: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882552.23194: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882552.23215: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882552.23232: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882552.23355: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882552.25156: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882552.25244: stdout chunk (state=3): >>><<< 19110 1726882552.25248: stderr chunk (state=3): >>><<< 19110 1726882552.25250: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from 
master 0 19110 1726882552.25253: _low_level_execute_command(): starting 19110 1726882552.25255: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882552.1160624-19548-221030287472070/AnsiballZ_stat.py && sleep 0' 19110 1726882552.27262: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882552.27268: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882552.27306: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882552.27310: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882552.27313: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882552.27557: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882552.27560: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882552.27716: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882552.40759: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/lsr27", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": 
false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 27187, "dev": 21, "nlink": 1, "atime": 1726882550.7224636, "mtime": 1726882550.7224636, "ctime": 1726882550.7224636, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/lsr27", "lnk_target": "../../devices/virtual/net/lsr27", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/lsr27", "follow": false, "checksum_algorithm": "sha1"}}} <<< 19110 1726882552.41748: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. <<< 19110 1726882552.41752: stdout chunk (state=3): >>><<< 19110 1726882552.41762: stderr chunk (state=3): >>><<< 19110 1726882552.41783: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/lsr27", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 27187, "dev": 21, "nlink": 1, "atime": 1726882550.7224636, "mtime": 1726882550.7224636, "ctime": 1726882550.7224636, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/lsr27", "lnk_target": "../../devices/virtual/net/lsr27", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": 
"/sys/class/net/lsr27", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
19110 1726882552.41839: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/lsr27', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882552.1160624-19548-221030287472070/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 19110 1726882552.41847: _low_level_execute_command(): starting 19110 1726882552.41852: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882552.1160624-19548-221030287472070/ > /dev/null 2>&1 && sleep 0' 19110 1726882552.42542: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882552.42550: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882552.42566: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882552.42580: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882552.42619: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882552.42627: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882552.42638: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882552.42650: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882552.42663: stderr 
chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882552.42673: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882552.42681: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882552.42691: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882552.42703: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882552.42710: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882552.42717: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882552.42727: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882552.42805: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882552.42884: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882552.42895: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882552.43188: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882552.45058: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882552.45066: stdout chunk (state=3): >>><<< 19110 1726882552.45075: stderr chunk (state=3): >>><<< 19110 1726882552.45099: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 1726882552.45105: handler run complete 19110 1726882552.45169: attempt loop complete, returning result 19110 1726882552.45172: _execute() done 19110 1726882552.45175: dumping result to json 19110 1726882552.45180: done dumping result, returning 19110 1726882552.45190: done running TaskExecutor() for managed_node1/TASK: Get stat for interface lsr27 [0e448fcc-3ce9-5372-c19a-00000000021e] 19110 1726882552.45195: sending task result for task 0e448fcc-3ce9-5372-c19a-00000000021e 19110 1726882552.45321: done sending task result for task 0e448fcc-3ce9-5372-c19a-00000000021e 19110 1726882552.45324: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "stat": { "atime": 1726882550.7224636, "block_size": 4096, "blocks": 0, "ctime": 1726882550.7224636, "dev": 21, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 27187, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/lsr27", "lnk_target": "../../devices/virtual/net/lsr27", "mode": "0777", "mtime": 1726882550.7224636, "nlink": 1, "path": "/sys/class/net/lsr27", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 
0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 19110 1726882552.45460: no more pending results, returning what we have 19110 1726882552.45466: results queue empty 19110 1726882552.45467: checking for any_errors_fatal 19110 1726882552.45468: done checking for any_errors_fatal 19110 1726882552.45469: checking for max_fail_percentage 19110 1726882552.45471: done checking for max_fail_percentage 19110 1726882552.45472: checking to see if all hosts have failed and the running result is not ok 19110 1726882552.45473: done checking to see if all hosts have failed 19110 1726882552.45473: getting the remaining hosts for this loop 19110 1726882552.45475: done getting the remaining hosts for this loop 19110 1726882552.45479: getting the next task for host managed_node1 19110 1726882552.45486: done getting next task for host managed_node1 19110 1726882552.45490: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 19110 1726882552.45494: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19110 1726882552.45501: getting variables 19110 1726882552.45503: in VariableManager get_vars() 19110 1726882552.45533: Calling all_inventory to load vars for managed_node1 19110 1726882552.45540: Calling groups_inventory to load vars for managed_node1 19110 1726882552.45544: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882552.45558: Calling all_plugins_play to load vars for managed_node1 19110 1726882552.45562: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882552.45571: Calling groups_plugins_play to load vars for managed_node1 19110 1726882552.46451: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882552.46993: done with get_vars() 19110 1726882552.47004: done getting variables 19110 1726882552.47311: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) 19110 1726882552.47543: variable 'interface' from source: set_fact TASK [Assert that the interface is present - 'lsr27'] ************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Friday 20 September 2024 21:35:52 -0400 (0:00:00.413) 0:00:09.333 ****** 19110 1726882552.47737: entering _queue_task() for managed_node1/assert 19110 1726882552.47740: Creating lock for assert 19110 1726882552.48298: worker is 1 (out of 1 available) 19110 1726882552.48311: exiting _queue_task() for managed_node1/assert 19110 1726882552.48330: done queuing things up, now waiting for results queue to drain 19110 1726882552.48332: waiting for pending results... 
19110 1726882552.49054: running TaskExecutor() for managed_node1/TASK: Assert that the interface is present - 'lsr27' 19110 1726882552.49889: in run() - task 0e448fcc-3ce9-5372-c19a-0000000001d4 19110 1726882552.49917: variable 'ansible_search_path' from source: unknown 19110 1726882552.49926: variable 'ansible_search_path' from source: unknown 19110 1726882552.49996: calling self._execute() 19110 1726882552.50091: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882552.50104: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882552.50120: variable 'omit' from source: magic vars 19110 1726882552.50567: variable 'ansible_distribution_major_version' from source: facts 19110 1726882552.51294: Evaluated conditional (ansible_distribution_major_version != '6'): True 19110 1726882552.51302: variable 'omit' from source: magic vars 19110 1726882552.51378: variable 'omit' from source: magic vars 19110 1726882552.51490: variable 'interface' from source: set_fact 19110 1726882552.51513: variable 'omit' from source: magic vars 19110 1726882552.51574: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19110 1726882552.51649: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19110 1726882552.51680: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19110 1726882552.51704: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19110 1726882552.51721: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19110 1726882552.51757: variable 'inventory_hostname' from source: host vars for 'managed_node1' 19110 1726882552.51769: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882552.51778: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882552.51923: Set connection var ansible_timeout to 10 19110 1726882552.52720: Set connection var ansible_module_compression to ZIP_DEFLATED 19110 1726882552.52738: Set connection var ansible_shell_executable to /bin/sh 19110 1726882552.52747: Set connection var ansible_shell_type to sh 19110 1726882552.52753: Set connection var ansible_connection to ssh 19110 1726882552.52765: Set connection var ansible_pipelining to False 19110 1726882552.52798: variable 'ansible_shell_executable' from source: unknown 19110 1726882552.52806: variable 'ansible_connection' from source: unknown 19110 1726882552.52814: variable 'ansible_module_compression' from source: unknown 19110 1726882552.52820: variable 'ansible_shell_type' from source: unknown 19110 1726882552.52826: variable 'ansible_shell_executable' from source: unknown 19110 1726882552.52832: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882552.52840: variable 'ansible_pipelining' from source: unknown 19110 1726882552.52847: variable 'ansible_timeout' from source: unknown 19110 1726882552.52854: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882552.53054: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 19110 1726882552.53075: variable 'omit' from source: magic vars 19110 1726882552.53086: starting attempt loop 19110 1726882552.53094: running the handler 19110 1726882552.53271: variable 'interface_stat' from source: set_fact 19110 1726882552.53322: Evaluated conditional (interface_stat.stat.exists): True 19110 1726882552.53333: handler run complete 19110 1726882552.53354: attempt loop complete, returning result 19110 
1726882552.53361: _execute() done 19110 1726882552.53372: dumping result to json 19110 1726882552.53380: done dumping result, returning 19110 1726882552.53393: done running TaskExecutor() for managed_node1/TASK: Assert that the interface is present - 'lsr27' [0e448fcc-3ce9-5372-c19a-0000000001d4] 19110 1726882552.53405: sending task result for task 0e448fcc-3ce9-5372-c19a-0000000001d4 ok: [managed_node1] => { "changed": false } MSG: All assertions passed 19110 1726882552.53599: no more pending results, returning what we have 19110 1726882552.53603: results queue empty 19110 1726882552.53604: checking for any_errors_fatal 19110 1726882552.53612: done checking for any_errors_fatal 19110 1726882552.53612: checking for max_fail_percentage 19110 1726882552.53614: done checking for max_fail_percentage 19110 1726882552.53615: checking to see if all hosts have failed and the running result is not ok 19110 1726882552.53616: done checking to see if all hosts have failed 19110 1726882552.53617: getting the remaining hosts for this loop 19110 1726882552.53618: done getting the remaining hosts for this loop 19110 1726882552.53622: getting the next task for host managed_node1 19110 1726882552.53631: done getting next task for host managed_node1 19110 1726882552.53633: ^ task is: TASK: meta (flush_handlers) 19110 1726882552.53635: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19110 1726882552.53640: getting variables 19110 1726882552.53642: in VariableManager get_vars() 19110 1726882552.53673: Calling all_inventory to load vars for managed_node1 19110 1726882552.53676: Calling groups_inventory to load vars for managed_node1 19110 1726882552.53680: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882552.53698: Calling all_plugins_play to load vars for managed_node1 19110 1726882552.53702: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882552.53705: Calling groups_plugins_play to load vars for managed_node1 19110 1726882552.53945: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882552.54200: done with get_vars() 19110 1726882552.54226: done getting variables 19110 1726882552.54569: done sending task result for task 0e448fcc-3ce9-5372-c19a-0000000001d4 19110 1726882552.54573: WORKER PROCESS EXITING 19110 1726882552.54639: in VariableManager get_vars() 19110 1726882552.54649: Calling all_inventory to load vars for managed_node1 19110 1726882552.54651: Calling groups_inventory to load vars for managed_node1 19110 1726882552.54654: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882552.54682: Calling all_plugins_play to load vars for managed_node1 19110 1726882552.54689: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882552.54693: Calling groups_plugins_play to load vars for managed_node1 19110 1726882552.54901: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882552.55098: done with get_vars() 19110 1726882552.55114: done queuing things up, now waiting for results queue to drain 19110 1726882552.55116: results queue empty 19110 1726882552.55116: checking for any_errors_fatal 19110 1726882552.55119: done checking for any_errors_fatal 19110 1726882552.55120: checking for max_fail_percentage 19110 
1726882552.55121: done checking for max_fail_percentage 19110 1726882552.55121: checking to see if all hosts have failed and the running result is not ok 19110 1726882552.55122: done checking to see if all hosts have failed 19110 1726882552.55145: getting the remaining hosts for this loop 19110 1726882552.55146: done getting the remaining hosts for this loop 19110 1726882552.55150: getting the next task for host managed_node1 19110 1726882552.55154: done getting next task for host managed_node1 19110 1726882552.55156: ^ task is: TASK: meta (flush_handlers) 19110 1726882552.55157: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19110 1726882552.55160: getting variables 19110 1726882552.55160: in VariableManager get_vars() 19110 1726882552.55170: Calling all_inventory to load vars for managed_node1 19110 1726882552.55172: Calling groups_inventory to load vars for managed_node1 19110 1726882552.55175: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882552.55179: Calling all_plugins_play to load vars for managed_node1 19110 1726882552.55181: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882552.55187: Calling groups_plugins_play to load vars for managed_node1 19110 1726882552.55338: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882552.55521: done with get_vars() 19110 1726882552.55529: done getting variables 19110 1726882552.55575: in VariableManager get_vars() 19110 1726882552.55583: Calling all_inventory to load vars for managed_node1 19110 1726882552.55585: Calling groups_inventory to load vars for managed_node1 19110 1726882552.55587: Calling all_plugins_inventory to load vars for managed_node1 19110 
1726882552.55591: Calling all_plugins_play to load vars for managed_node1 19110 1726882552.55594: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882552.55597: Calling groups_plugins_play to load vars for managed_node1 19110 1726882552.55760: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882552.55949: done with get_vars() 19110 1726882552.55960: done queuing things up, now waiting for results queue to drain 19110 1726882552.55962: results queue empty 19110 1726882552.55965: checking for any_errors_fatal 19110 1726882552.55966: done checking for any_errors_fatal 19110 1726882552.55967: checking for max_fail_percentage 19110 1726882552.55968: done checking for max_fail_percentage 19110 1726882552.55968: checking to see if all hosts have failed and the running result is not ok 19110 1726882552.55969: done checking to see if all hosts have failed 19110 1726882552.55970: getting the remaining hosts for this loop 19110 1726882552.55971: done getting the remaining hosts for this loop 19110 1726882552.55973: getting the next task for host managed_node1 19110 1726882552.55975: done getting next task for host managed_node1 19110 1726882552.55976: ^ task is: None 19110 1726882552.55978: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19110 1726882552.55979: done queuing things up, now waiting for results queue to drain 19110 1726882552.55980: results queue empty 19110 1726882552.55980: checking for any_errors_fatal 19110 1726882552.55981: done checking for any_errors_fatal 19110 1726882552.55982: checking for max_fail_percentage 19110 1726882552.55982: done checking for max_fail_percentage 19110 1726882552.55983: checking to see if all hosts have failed and the running result is not ok 19110 1726882552.55984: done checking to see if all hosts have failed 19110 1726882552.55985: getting the next task for host managed_node1 19110 1726882552.55987: done getting next task for host managed_node1 19110 1726882552.55988: ^ task is: None 19110 1726882552.55989: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19110 1726882552.56032: in VariableManager get_vars() 19110 1726882552.56051: done with get_vars() 19110 1726882552.56057: in VariableManager get_vars() 19110 1726882552.56071: done with get_vars() 19110 1726882552.56076: variable 'omit' from source: magic vars 19110 1726882552.56104: in VariableManager get_vars() 19110 1726882552.56117: done with get_vars() 19110 1726882552.56137: variable 'omit' from source: magic vars PLAY [Test static interface up] ************************************************ 19110 1726882552.56745: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 19110 1726882552.56771: getting the remaining hosts for this loop 19110 1726882552.56772: done getting the remaining hosts for this loop 19110 1726882552.56775: getting the next task for host managed_node1 19110 1726882552.56777: done getting next task for host managed_node1 19110 1726882552.56779: ^ task is: TASK: Gathering Facts 19110 1726882552.56780: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19110 1726882552.56782: getting variables 19110 1726882552.56783: in VariableManager get_vars() 19110 1726882552.56793: Calling all_inventory to load vars for managed_node1 19110 1726882552.56796: Calling groups_inventory to load vars for managed_node1 19110 1726882552.56798: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882552.56802: Calling all_plugins_play to load vars for managed_node1 19110 1726882552.56805: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882552.56807: Calling groups_plugins_play to load vars for managed_node1 19110 1726882552.56942: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882552.57168: done with get_vars() 19110 1726882552.57178: done getting variables 19110 1726882552.57218: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:33 Friday 20 September 2024 21:35:52 -0400 (0:00:00.096) 0:00:09.429 ****** 19110 1726882552.57240: entering _queue_task() for managed_node1/gather_facts 19110 1726882552.57521: worker is 1 (out of 1 available) 19110 1726882552.57533: exiting _queue_task() for managed_node1/gather_facts 19110 1726882552.57543: done queuing things up, now waiting for results queue to drain 19110 1726882552.57545: waiting for pending results... 
19110 1726882552.57795: running TaskExecutor() for managed_node1/TASK: Gathering Facts 19110 1726882552.57886: in run() - task 0e448fcc-3ce9-5372-c19a-000000000237 19110 1726882552.57908: variable 'ansible_search_path' from source: unknown 19110 1726882552.57949: calling self._execute() 19110 1726882552.58035: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882552.58048: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882552.58063: variable 'omit' from source: magic vars 19110 1726882552.58514: variable 'ansible_distribution_major_version' from source: facts 19110 1726882552.58535: Evaluated conditional (ansible_distribution_major_version != '6'): True 19110 1726882552.58545: variable 'omit' from source: magic vars 19110 1726882552.58576: variable 'omit' from source: magic vars 19110 1726882552.58614: variable 'omit' from source: magic vars 19110 1726882552.58661: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19110 1726882552.58705: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19110 1726882552.58731: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19110 1726882552.58758: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19110 1726882552.58777: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19110 1726882552.58810: variable 'inventory_hostname' from source: host vars for 'managed_node1' 19110 1726882552.58818: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882552.58826: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882552.58928: Set connection var ansible_timeout to 10 19110 1726882552.58946: Set connection 
var ansible_module_compression to ZIP_DEFLATED 19110 1726882552.58957: Set connection var ansible_shell_executable to /bin/sh 19110 1726882552.58970: Set connection var ansible_shell_type to sh 19110 1726882552.58978: Set connection var ansible_connection to ssh 19110 1726882552.58987: Set connection var ansible_pipelining to False 19110 1726882552.59013: variable 'ansible_shell_executable' from source: unknown 19110 1726882552.59021: variable 'ansible_connection' from source: unknown 19110 1726882552.59028: variable 'ansible_module_compression' from source: unknown 19110 1726882552.59034: variable 'ansible_shell_type' from source: unknown 19110 1726882552.59043: variable 'ansible_shell_executable' from source: unknown 19110 1726882552.59050: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882552.59057: variable 'ansible_pipelining' from source: unknown 19110 1726882552.59066: variable 'ansible_timeout' from source: unknown 19110 1726882552.59077: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882552.59249: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 19110 1726882552.59266: variable 'omit' from source: magic vars 19110 1726882552.59277: starting attempt loop 19110 1726882552.59285: running the handler 19110 1726882552.59305: variable 'ansible_facts' from source: unknown 19110 1726882552.59326: _low_level_execute_command(): starting 19110 1726882552.59339: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 19110 1726882552.60216: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882552.60232: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 
1726882552.60248: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882552.60273: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882552.60315: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882552.60328: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882552.60344: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882552.60366: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882552.60382: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882552.60393: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882552.60406: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882552.60419: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882552.60434: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882552.60446: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882552.60459: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882552.60476: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882552.60554: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882552.60580: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882552.60600: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882552.60733: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 
1726882552.62405: stdout chunk (state=3): >>>/root <<< 19110 1726882552.62590: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882552.62594: stdout chunk (state=3): >>><<< 19110 1726882552.62596: stderr chunk (state=3): >>><<< 19110 1726882552.62712: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 1726882552.62716: _low_level_execute_command(): starting 19110 1726882552.62721: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882552.6261733-19569-269983446095662 `" && echo ansible-tmp-1726882552.6261733-19569-269983446095662="` echo /root/.ansible/tmp/ansible-tmp-1726882552.6261733-19569-269983446095662 `" ) && sleep 0' 19110 1726882552.64090: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 
3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882552.64093: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882552.64209: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found <<< 19110 1726882552.64212: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882552.64216: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882552.64402: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882552.64406: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882552.64409: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882552.64518: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882552.66382: stdout chunk (state=3): >>>ansible-tmp-1726882552.6261733-19569-269983446095662=/root/.ansible/tmp/ansible-tmp-1726882552.6261733-19569-269983446095662 <<< 19110 1726882552.66494: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882552.66570: stderr chunk (state=3): >>><<< 19110 1726882552.66573: stdout chunk (state=3): >>><<< 19110 1726882552.66871: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726882552.6261733-19569-269983446095662=/root/.ansible/tmp/ansible-tmp-1726882552.6261733-19569-269983446095662 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 1726882552.66874: variable 'ansible_module_compression' from source: unknown 19110 1726882552.66877: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-191108pnkimox/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 19110 1726882552.66879: variable 'ansible_facts' from source: unknown 19110 1726882552.66925: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882552.6261733-19569-269983446095662/AnsiballZ_setup.py 19110 1726882552.67588: Sending initial data 19110 1726882552.67591: Sent initial data (154 bytes) 19110 1726882552.70014: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882552.70034: stderr chunk (state=3): >>>debug1: Reading configuration 
data /root/.ssh/config <<< 19110 1726882552.70051: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882552.70073: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882552.70117: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882552.70151: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882552.70169: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882552.70187: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882552.70261: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882552.70276: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882552.70287: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882552.70301: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882552.70318: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882552.70332: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882552.70344: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882552.70362: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882552.70440: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882552.70592: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882552.70611: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882552.70736: stderr chunk (state=3): >>>debug1: mux_client_request_session: master 
session id: 2 <<< 19110 1726882552.72498: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 19110 1726882552.72590: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 19110 1726882552.72692: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-191108pnkimox/tmpp2j1acki /root/.ansible/tmp/ansible-tmp-1726882552.6261733-19569-269983446095662/AnsiballZ_setup.py <<< 19110 1726882552.72786: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 19110 1726882552.76087: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882552.76271: stderr chunk (state=3): >>><<< 19110 1726882552.76275: stdout chunk (state=3): >>><<< 19110 1726882552.76378: done transferring module to remote 19110 1726882552.76382: _low_level_execute_command(): starting 19110 1726882552.76384: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882552.6261733-19569-269983446095662/ /root/.ansible/tmp/ansible-tmp-1726882552.6261733-19569-269983446095662/AnsiballZ_setup.py && sleep 0' 19110 1726882552.77802: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882552.77883: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882552.77901: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882552.77929: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882552.77975: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882552.78032: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882552.78048: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882552.78069: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882552.78083: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882552.78095: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882552.78108: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882552.78122: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882552.78142: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882552.78155: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882552.78169: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882552.78183: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882552.78376: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882552.78393: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882552.78408: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882552.78578: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882552.80430: stderr chunk (state=3): >>>debug2: 
Received exit status from master 0 <<< 19110 1726882552.80434: stdout chunk (state=3): >>><<< 19110 1726882552.80436: stderr chunk (state=3): >>><<< 19110 1726882552.80469: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 1726882552.80472: _low_level_execute_command(): starting 19110 1726882552.80474: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882552.6261733-19569-269983446095662/AnsiballZ_setup.py && sleep 0' 19110 1726882552.82139: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882552.82292: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882552.82308: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882552.82328: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882552.82374: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882552.82391: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882552.82406: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882552.82424: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882552.82435: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882552.82446: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882552.82459: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882552.82476: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882552.82492: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882552.82508: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882552.82519: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882552.82532: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882552.82613: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882552.82735: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882552.82750: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882552.82950: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882553.36043: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": 
"38", "day": "20", "hour": "21", "minute": "35", "second": "53", "epoch": "1726882553", "epoch_int": "1726882553", "date": "2024-09-20", "time": "21:35:53", "iso8601_micro": "2024-09-21T01:35:53.051468Z", "iso8601": "2024-09-21T01:35:53Z", "iso8601_basic": "20240920T213553051468", "iso8601_basic_short": "20240920T213553", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_fibre_channel_wwn": [], "ansible_loadavg": {"1m": 0.46, "5m": 0.4, "15m": 0.21}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_iscsi_iqn": "", "ansible_system": "Linux", "ansible_kernel": "5.14.0-508.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 15:49:37 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "ip-10-31-44-90.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-44-90", "ansible_nodename": "ip-10-31-44-90.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "465293f2bd9b457497a5eaf565f184f8", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2772, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 760, "free": 2772}, "nocache": {"free": 3234, "used": 298}, "swap": {"total": 0, "free": 0, "used": 0, "cac<<< 19110 1726882553.36079: stdout chunk (state=3): >>>hed": 0}}, 
"ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2ca037-5a7b-1ebc-28c4-7fb32adce23a", "ansible_product_uuid": "ec2ca037-5a7b-1ebc-28c4-7fb32adce23a", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 711, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264239312896, "block_size": 4096, "block_total": 65519355, "block_available": 64511551, "block_used": 1007804, "inode_total": 131071472, "inode_available": 130998699, "inode_used": 72773, "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013"}], "ansible_distribution": "CentOS", 
"ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_apparmor": {"status": "disabled"}, "ansible_lsb": {}, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBANd7RrvRqB/kTBmp4g3vOXSd1XQX7zSKmnuTmpsZ60vmB0PwXJIj7HTI9DBqGrzofXOOHlh/Qs4ml+D2H79EO+6EUEZza9meZ+anJVVwXBuxgfn1Hl/EExsmp7gV5o+r0rCFeuFds6ISUZQpal2OlQk3lrit00RA//JoxvQO1YQdAAAAFQDxLJGSeERY5sZYyGr88om1BNq9KQAAAIEA1WO5ElHMof6Upf9GQZn4wlJh6JUOjZfYnL4XATh/W6uye7kuC/rBqGPirkmks1GCUkKhSkzQlRIPyLYENrbPKRMNarYAwwQ8N8BDOUWFDCcrO55SJdlLTyYGWdlVFysYGSMIyZT5ye4oL3Nkff/e1ZGqjvL0sLNJaye4za70Xj4AAACAXRnpJPDKgCzakgHO14HsH3r7qS4cvfQld+ThBJZcbtKtJUyUo1sqg9NbvK+hvA41TYWOVy52DcWn5yzPwfhAn+mQEcAdBXWqSSJdESZ9fPnbc9C1XO02sYpyX5+wsZnau23XhtnlnY8jRTpWgRt4X8AWaSND9mfeL8ZDTgVphRc=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQCi3knQjBMOMykE1WC1CWkOsV5q0hmL0iOCSTq050rqbxjUmkXoe4BeOWZh4ID8k5GgD5emRh5PU5ME25rsP6hnelUBzDOCjdgI1rmTkUbw5oFRN+kpX2RbAM/2L5J6RrqqllFBjrHtSG6JINsPIWqLn9rlmE965Kj4AY3QplgiApeN07yDM5YPS2tpRpVHVIxZia7pdzKs+h+TXJoo/Z7SMAw8MTUNeIXd9bSzuxhulOrLNrvrcej4EVg88FYiy96oA+NBNLFb41RHNLumM3xUQvjCeyqP1hcUw9YAY+gwADhRGBKQ+JCAzzLqyM/3RAO8gXXJr1Yjr+H9xi8IwKB71QU/lw7bWc33YuNbe5mDlUHQ/a2qvo4O96wD8m4eZpu81iHiwtIU5cwKm+fk8sz9kxOR77AozaYLtjgW9FYUmxh2ChfTBV2rnFflhC3CjgRMlZv8CLtne5JcsRFSpHeCB2RXzA1JPiF89OxoUFa8NsPqdUyAMUkaR8MmW+fj+t8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAqf0PBdBRTqmYqGYFABEC2soCa/Gsi3A2munUEZEo0enWjwRQivB5pKJDjOn6lwgGyr2ebU0/VpRzddk73uEfk=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOdij61fBkKDElS700Z560nYW2c4QCIx/VplUW7jn+UE", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:ef4e1c39-6f50-438a-87e7-12fb70b80bde", "ansible_interfaces": ["lsr27", "lo", "eth0", "peerlsr27"], "ansible_eth0": {"device": "eth0", "macaddress": "02:9e:a1:0b:f9:6d", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.44.90", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::9e:a1ff:fe0b:f96d", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off 
[fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", 
"hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lsr27": {"device": "lsr27", "macaddress": "a6:f2:4e:57:42:4a", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::b9de:9c61:daf5:5f43", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", 
"l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_peerlsr27": {"device": "peerlsr27", "macaddress": "3a:77:3d:04:80:fa", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::3877:3dff:fe04:80fa", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", 
"tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off 
[fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.44.90", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:9e:a1:0b:f9:6d", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.44.90"], "ansible_all_ipv6_addresses": ["fe80::9e:a1ff:fe0b:f96d", 
"fe80::b9de:9c61:daf5:5f43", "fe80::3877:3dff:fe04:80fa"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.44.90", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::9e:a1ff:fe0b:f96d", "fe80::b9de:9c61:daf5:5f43"]}, "ansible_env": {"SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.40.7 34614 10.31.44.90 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.40.7 34614 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path<<< 19110 1726882553.36104: stdout chunk (state=3): >>>=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_fips": false, "ansible_local": {}, "ansible_is_chroot": false, "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": 
["tty0", "ttyS0,115200n8"]}, "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 19110 1726882553.37681: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. <<< 19110 1726882553.37744: stderr chunk (state=3): >>><<< 19110 1726882553.37747: stdout chunk (state=3): >>><<< 19110 1726882553.38074: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "35", "second": "53", "epoch": "1726882553", "epoch_int": "1726882553", "date": "2024-09-20", "time": "21:35:53", "iso8601_micro": "2024-09-21T01:35:53.051468Z", "iso8601": "2024-09-21T01:35:53Z", "iso8601_basic": "20240920T213553051468", "iso8601_basic_short": "20240920T213553", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_fibre_channel_wwn": [], "ansible_loadavg": {"1m": 0.46, "5m": 0.4, "15m": 0.21}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_iscsi_iqn": "", "ansible_system": "Linux", "ansible_kernel": "5.14.0-508.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 15:49:37 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "ip-10-31-44-90.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-44-90", "ansible_nodename": "ip-10-31-44-90.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "465293f2bd9b457497a5eaf565f184f8", 
"ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2772, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 760, "free": 2772}, "nocache": {"free": 3234, "used": 298}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2ca037-5a7b-1ebc-28c4-7fb32adce23a", "ansible_product_uuid": "ec2ca037-5a7b-1ebc-28c4-7fb32adce23a", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": 
["6c640f10-8261-4074-b9b8-2cdc3ddcc013"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 711, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264239312896, "block_size": 4096, "block_total": 65519355, "block_available": 64511551, "block_used": 1007804, "inode_total": 131071472, "inode_available": 130998699, "inode_used": 72773, "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013"}], "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_apparmor": {"status": "disabled"}, "ansible_lsb": {}, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBANd7RrvRqB/kTBmp4g3vOXSd1XQX7zSKmnuTmpsZ60vmB0PwXJIj7HTI9DBqGrzofXOOHlh/Qs4ml+D2H79EO+6EUEZza9meZ+anJVVwXBuxgfn1Hl/EExsmp7gV5o+r0rCFeuFds6ISUZQpal2OlQk3lrit00RA//JoxvQO1YQdAAAAFQDxLJGSeERY5sZYyGr88om1BNq9KQAAAIEA1WO5ElHMof6Upf9GQZn4wlJh6JUOjZfYnL4XATh/W6uye7kuC/rBqGPirkmks1GCUkKhSkzQlRIPyLYENrbPKRMNarYAwwQ8N8BDOUWFDCcrO55SJdlLTyYGWdlVFysYGSMIyZT5ye4oL3Nkff/e1ZGqjvL0sLNJaye4za70Xj4AAACAXRnpJPDKgCzakgHO14HsH3r7qS4cvfQld+ThBJZcbtKtJUyUo1sqg9NbvK+hvA41TYWOVy52DcWn5yzPwfhAn+mQEcAdBXWqSSJdESZ9fPnbc9C1XO02sYpyX5+wsZnau23XhtnlnY8jRTpWgRt4X8AWaSND9mfeL8ZDTgVphRc=", 
"ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCi3knQjBMOMykE1WC1CWkOsV5q0hmL0iOCSTq050rqbxjUmkXoe4BeOWZh4ID8k5GgD5emRh5PU5ME25rsP6hnelUBzDOCjdgI1rmTkUbw5oFRN+kpX2RbAM/2L5J6RrqqllFBjrHtSG6JINsPIWqLn9rlmE965Kj4AY3QplgiApeN07yDM5YPS2tpRpVHVIxZia7pdzKs+h+TXJoo/Z7SMAw8MTUNeIXd9bSzuxhulOrLNrvrcej4EVg88FYiy96oA+NBNLFb41RHNLumM3xUQvjCeyqP1hcUw9YAY+gwADhRGBKQ+JCAzzLqyM/3RAO8gXXJr1Yjr+H9xi8IwKB71QU/lw7bWc33YuNbe5mDlUHQ/a2qvo4O96wD8m4eZpu81iHiwtIU5cwKm+fk8sz9kxOR77AozaYLtjgW9FYUmxh2ChfTBV2rnFflhC3CjgRMlZv8CLtne5JcsRFSpHeCB2RXzA1JPiF89OxoUFa8NsPqdUyAMUkaR8MmW+fj+t8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAqf0PBdBRTqmYqGYFABEC2soCa/Gsi3A2munUEZEo0enWjwRQivB5pKJDjOn6lwgGyr2ebU0/VpRzddk73uEfk=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOdij61fBkKDElS700Z560nYW2c4QCIx/VplUW7jn+UE", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:ef4e1c39-6f50-438a-87e7-12fb70b80bde", "ansible_interfaces": ["lsr27", "lo", "eth0", "peerlsr27"], "ansible_eth0": {"device": "eth0", "macaddress": "02:9e:a1:0b:f9:6d", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.44.90", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::9e:a1ff:fe0b:f96d", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", 
"tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": 
"off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lsr27": {"device": "lsr27", "macaddress": "a6:f2:4e:57:42:4a", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::b9de:9c61:daf5:5f43", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", 
"rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_peerlsr27": {"device": "peerlsr27", "macaddress": "3a:77:3d:04:80:fa", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::3877:3dff:fe04:80fa", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", 
"tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": 
"off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.44.90", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:9e:a1:0b:f9:6d", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.44.90"], 
"ansible_all_ipv6_addresses": ["fe80::9e:a1ff:fe0b:f96d", "fe80::b9de:9c61:daf5:5f43", "fe80::3877:3dff:fe04:80fa"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.44.90", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::9e:a1ff:fe0b:f96d", "fe80::b9de:9c61:daf5:5f43"]}, "ansible_env": {"SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.40.7 34614 10.31.44.90 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.40.7 34614 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_fips": false, "ansible_local": {}, "ansible_is_chroot": false, "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": 
["tty0", "ttyS0,115200n8"]}, "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
19110 1726882553.38267: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882552.6261733-19569-269983446095662/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 19110 1726882553.38297: _low_level_execute_command(): starting 19110 1726882553.38304: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882552.6261733-19569-269983446095662/ > /dev/null 2>&1 && sleep 0' 19110 1726882553.39975: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882553.39989: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882553.40003: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882553.40019: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882553.40069: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882553.40082: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882553.40096: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882553.40113: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882553.40124: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 
10.31.44.90 is address <<< 19110 1726882553.40136: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882553.40153: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882553.40168: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882553.40183: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882553.40265: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882553.40278: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882553.40291: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882553.40481: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882553.40503: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882553.40519: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882553.40643: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882553.42495: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882553.42545: stderr chunk (state=3): >>><<< 19110 1726882553.42549: stdout chunk (state=3): >>><<< 19110 1726882553.42674: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 
is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 1726882553.42677: handler run complete 19110 1726882553.42872: variable 'ansible_facts' from source: unknown 19110 1726882553.42987: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882553.43403: variable 'ansible_facts' from source: unknown 19110 1726882553.43510: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882553.43679: attempt loop complete, returning result 19110 1726882553.43690: _execute() done 19110 1726882553.43697: dumping result to json 19110 1726882553.43740: done dumping result, returning 19110 1726882553.43753: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [0e448fcc-3ce9-5372-c19a-000000000237] 19110 1726882553.43766: sending task result for task 0e448fcc-3ce9-5372-c19a-000000000237 ok: [managed_node1] 19110 1726882553.44503: no more pending results, returning what we have 19110 1726882553.44506: results queue empty 19110 1726882553.44507: checking for any_errors_fatal 19110 1726882553.44508: done checking for any_errors_fatal 19110 1726882553.44509: checking for max_fail_percentage 19110 1726882553.44510: done checking for max_fail_percentage 19110 1726882553.44511: checking to see if all hosts have failed and the running result is not ok 19110 1726882553.44512: done 
checking to see if all hosts have failed 19110 1726882553.44513: getting the remaining hosts for this loop 19110 1726882553.44514: done getting the remaining hosts for this loop 19110 1726882553.44517: getting the next task for host managed_node1 19110 1726882553.44522: done getting next task for host managed_node1 19110 1726882553.44524: ^ task is: TASK: meta (flush_handlers) 19110 1726882553.44526: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19110 1726882553.44529: getting variables 19110 1726882553.44531: in VariableManager get_vars() 19110 1726882553.44560: Calling all_inventory to load vars for managed_node1 19110 1726882553.44578: Calling groups_inventory to load vars for managed_node1 19110 1726882553.44587: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882553.44599: Calling all_plugins_play to load vars for managed_node1 19110 1726882553.44601: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882553.44604: Calling groups_plugins_play to load vars for managed_node1 19110 1726882553.44856: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882553.45307: done with get_vars() 19110 1726882553.45318: done getting variables 19110 1726882553.45462: done sending task result for task 0e448fcc-3ce9-5372-c19a-000000000237 19110 1726882553.45468: WORKER PROCESS EXITING 19110 1726882553.45517: in VariableManager get_vars() 19110 1726882553.45529: Calling all_inventory to load vars for managed_node1 19110 1726882553.45532: Calling groups_inventory to load vars for managed_node1 19110 1726882553.45534: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882553.45539: Calling 
all_plugins_play to load vars for managed_node1 19110 1726882553.45541: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882553.45547: Calling groups_plugins_play to load vars for managed_node1 19110 1726882553.45888: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882553.46321: done with get_vars() 19110 1726882553.46334: done queuing things up, now waiting for results queue to drain 19110 1726882553.46336: results queue empty 19110 1726882553.46337: checking for any_errors_fatal 19110 1726882553.46368: done checking for any_errors_fatal 19110 1726882553.46370: checking for max_fail_percentage 19110 1726882553.46371: done checking for max_fail_percentage 19110 1726882553.46372: checking to see if all hosts have failed and the running result is not ok 19110 1726882553.46373: done checking to see if all hosts have failed 19110 1726882553.46373: getting the remaining hosts for this loop 19110 1726882553.46374: done getting the remaining hosts for this loop 19110 1726882553.46377: getting the next task for host managed_node1 19110 1726882553.46381: done getting next task for host managed_node1 19110 1726882553.46383: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 19110 1726882553.46385: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19110 1726882553.46395: getting variables 19110 1726882553.46396: in VariableManager get_vars() 19110 1726882553.46409: Calling all_inventory to load vars for managed_node1 19110 1726882553.46411: Calling groups_inventory to load vars for managed_node1 19110 1726882553.46413: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882553.46417: Calling all_plugins_play to load vars for managed_node1 19110 1726882553.46420: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882553.46423: Calling groups_plugins_play to load vars for managed_node1 19110 1726882553.46877: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882553.47157: done with get_vars() 19110 1726882553.47168: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 21:35:53 -0400 (0:00:00.900) 0:00:10.329 ****** 19110 1726882553.47247: entering _queue_task() for managed_node1/include_tasks 19110 1726882553.47524: worker is 1 (out of 1 available) 19110 1726882553.47535: exiting _queue_task() for managed_node1/include_tasks 19110 1726882553.47547: done queuing things up, now waiting for results queue to drain 19110 1726882553.47548: waiting for pending results... 
19110 1726882553.47903: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 19110 1726882553.48010: in run() - task 0e448fcc-3ce9-5372-c19a-000000000019 19110 1726882553.48030: variable 'ansible_search_path' from source: unknown 19110 1726882553.48038: variable 'ansible_search_path' from source: unknown 19110 1726882553.48079: calling self._execute() 19110 1726882553.48160: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882553.48173: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882553.48188: variable 'omit' from source: magic vars 19110 1726882553.48552: variable 'ansible_distribution_major_version' from source: facts 19110 1726882553.48573: Evaluated conditional (ansible_distribution_major_version != '6'): True 19110 1726882553.48585: _execute() done 19110 1726882553.48593: dumping result to json 19110 1726882553.48601: done dumping result, returning 19110 1726882553.48612: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0e448fcc-3ce9-5372-c19a-000000000019] 19110 1726882553.48623: sending task result for task 0e448fcc-3ce9-5372-c19a-000000000019 19110 1726882553.48762: no more pending results, returning what we have 19110 1726882553.48769: in VariableManager get_vars() 19110 1726882553.48808: Calling all_inventory to load vars for managed_node1 19110 1726882553.48811: Calling groups_inventory to load vars for managed_node1 19110 1726882553.48813: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882553.48824: Calling all_plugins_play to load vars for managed_node1 19110 1726882553.48826: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882553.48829: Calling groups_plugins_play to load vars for managed_node1 19110 1726882553.49018: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' 
skipped due to reserved name 19110 1726882553.49289: done with get_vars() 19110 1726882553.49296: variable 'ansible_search_path' from source: unknown 19110 1726882553.49298: variable 'ansible_search_path' from source: unknown 19110 1726882553.49323: we have included files to process 19110 1726882553.49324: generating all_blocks data 19110 1726882553.49326: done generating all_blocks data 19110 1726882553.49326: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 19110 1726882553.49327: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 19110 1726882553.49329: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 19110 1726882553.49525: done sending task result for task 0e448fcc-3ce9-5372-c19a-000000000019 19110 1726882553.49528: WORKER PROCESS EXITING 19110 1726882553.50026: done processing included file 19110 1726882553.50029: iterating over new_blocks loaded from include file 19110 1726882553.50030: in VariableManager get_vars() 19110 1726882553.50051: done with get_vars() 19110 1726882553.50053: filtering new block on tags 19110 1726882553.50074: done filtering new block on tags 19110 1726882553.50077: in VariableManager get_vars() 19110 1726882553.50095: done with get_vars() 19110 1726882553.50097: filtering new block on tags 19110 1726882553.50115: done filtering new block on tags 19110 1726882553.50117: in VariableManager get_vars() 19110 1726882553.50136: done with get_vars() 19110 1726882553.50137: filtering new block on tags 19110 1726882553.50152: done filtering new block on tags 19110 1726882553.50157: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node1 19110 1726882553.50162: extending task lists for all hosts 
with included blocks 19110 1726882553.50547: done extending task lists 19110 1726882553.50549: done processing included files 19110 1726882553.50550: results queue empty 19110 1726882553.50550: checking for any_errors_fatal 19110 1726882553.50552: done checking for any_errors_fatal 19110 1726882553.50552: checking for max_fail_percentage 19110 1726882553.50556: done checking for max_fail_percentage 19110 1726882553.50557: checking to see if all hosts have failed and the running result is not ok 19110 1726882553.50558: done checking to see if all hosts have failed 19110 1726882553.50559: getting the remaining hosts for this loop 19110 1726882553.50560: done getting the remaining hosts for this loop 19110 1726882553.50563: getting the next task for host managed_node1 19110 1726882553.50568: done getting next task for host managed_node1 19110 1726882553.50570: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 19110 1726882553.50572: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19110 1726882553.50581: getting variables 19110 1726882553.50582: in VariableManager get_vars() 19110 1726882553.50595: Calling all_inventory to load vars for managed_node1 19110 1726882553.50598: Calling groups_inventory to load vars for managed_node1 19110 1726882553.50600: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882553.50604: Calling all_plugins_play to load vars for managed_node1 19110 1726882553.50607: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882553.50610: Calling groups_plugins_play to load vars for managed_node1 19110 1726882553.50792: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882553.51031: done with get_vars() 19110 1726882553.51040: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 21:35:53 -0400 (0:00:00.038) 0:00:10.368 ****** 19110 1726882553.51111: entering _queue_task() for managed_node1/setup 19110 1726882553.51387: worker is 1 (out of 1 available) 19110 1726882553.51400: exiting _queue_task() for managed_node1/setup 19110 1726882553.51409: done queuing things up, now waiting for results queue to drain 19110 1726882553.51411: waiting for pending results... 
19110 1726882553.52135: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 19110 1726882553.52248: in run() - task 0e448fcc-3ce9-5372-c19a-000000000279 19110 1726882553.52275: variable 'ansible_search_path' from source: unknown 19110 1726882553.52284: variable 'ansible_search_path' from source: unknown 19110 1726882553.52331: calling self._execute() 19110 1726882553.52532: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882553.52544: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882553.52560: variable 'omit' from source: magic vars 19110 1726882553.53337: variable 'ansible_distribution_major_version' from source: facts 19110 1726882553.53359: Evaluated conditional (ansible_distribution_major_version != '6'): True 19110 1726882553.53639: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 19110 1726882553.56381: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 19110 1726882553.56452: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 19110 1726882553.56497: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 19110 1726882553.56535: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 19110 1726882553.56578: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 19110 1726882553.56674: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19110 1726882553.56709: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19110 1726882553.56742: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19110 1726882553.56799: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19110 1726882553.56819: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19110 1726882553.56884: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19110 1726882553.56913: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19110 1726882553.56944: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19110 1726882553.57000: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19110 1726882553.57020: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19110 1726882553.57190: variable '__network_required_facts' from source: role 
'' defaults 19110 1726882553.57209: variable 'ansible_facts' from source: unknown 19110 1726882553.57320: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 19110 1726882553.57328: when evaluation is False, skipping this task 19110 1726882553.57336: _execute() done 19110 1726882553.57341: dumping result to json 19110 1726882553.57348: done dumping result, returning 19110 1726882553.57365: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0e448fcc-3ce9-5372-c19a-000000000279] 19110 1726882553.57376: sending task result for task 0e448fcc-3ce9-5372-c19a-000000000279 skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 19110 1726882553.57531: no more pending results, returning what we have 19110 1726882553.57535: results queue empty 19110 1726882553.57536: checking for any_errors_fatal 19110 1726882553.57537: done checking for any_errors_fatal 19110 1726882553.57538: checking for max_fail_percentage 19110 1726882553.57540: done checking for max_fail_percentage 19110 1726882553.57540: checking to see if all hosts have failed and the running result is not ok 19110 1726882553.57541: done checking to see if all hosts have failed 19110 1726882553.57542: getting the remaining hosts for this loop 19110 1726882553.57544: done getting the remaining hosts for this loop 19110 1726882553.57548: getting the next task for host managed_node1 19110 1726882553.57560: done getting next task for host managed_node1 19110 1726882553.57566: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 19110 1726882553.57570: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19110 1726882553.57584: getting variables 19110 1726882553.57587: in VariableManager get_vars() 19110 1726882553.57631: Calling all_inventory to load vars for managed_node1 19110 1726882553.57634: Calling groups_inventory to load vars for managed_node1 19110 1726882553.57637: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882553.57649: Calling all_plugins_play to load vars for managed_node1 19110 1726882553.57652: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882553.57657: Calling groups_plugins_play to load vars for managed_node1 19110 1726882553.57846: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882553.58708: done sending task result for task 0e448fcc-3ce9-5372-c19a-000000000279 19110 1726882553.58711: WORKER PROCESS EXITING 19110 1726882553.58788: done with get_vars() 19110 1726882553.58798: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 21:35:53 -0400 (0:00:00.077) 0:00:10.446 ****** 19110 1726882553.58893: entering _queue_task() for managed_node1/stat 19110 1726882553.59145: worker is 1 (out of 1 available) 19110 1726882553.59160: exiting _queue_task() for managed_node1/stat 19110 1726882553.59173: done queuing things up, now waiting for results queue to drain 19110 1726882553.59175: waiting for pending results... 
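As an aside, the skip decision recorded above for "Ensure ansible_facts used by role are present" hinges on the Jinja expression `__network_required_facts | difference(ansible_facts.keys() | list) | length > 0`: the setup task only runs if some required fact is still missing. A minimal Python sketch of that logic follows; the example fact names and values are assumptions for illustration, not values taken from this run:

```python
def facts_missing(required_facts, ansible_facts):
    """Mirror Jinja's difference filter: required facts not yet gathered."""
    return [f for f in required_facts if f not in ansible_facts]

# Hypothetical example values (not from this log):
required = ["distribution", "os_family"]
gathered = {"distribution": "Fedora", "os_family": "RedHat", "hostname": "managed_node1"}

# Empty list -> the condition "length > 0" is False -> the task is skipped,
# matching the "Evaluated conditional (...): False ... skipping this task" above.
print(facts_missing(required, gathered))  # -> []
```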
19110 1726882553.59423: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if system is ostree 19110 1726882553.59557: in run() - task 0e448fcc-3ce9-5372-c19a-00000000027b 19110 1726882553.59583: variable 'ansible_search_path' from source: unknown 19110 1726882553.59591: variable 'ansible_search_path' from source: unknown 19110 1726882553.59635: calling self._execute() 19110 1726882553.59715: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882553.59730: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882553.59742: variable 'omit' from source: magic vars 19110 1726882553.60106: variable 'ansible_distribution_major_version' from source: facts 19110 1726882553.60122: Evaluated conditional (ansible_distribution_major_version != '6'): True 19110 1726882553.60301: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 19110 1726882553.60556: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 19110 1726882553.60613: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 19110 1726882553.60650: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 19110 1726882553.60693: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 19110 1726882553.60786: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 19110 1726882553.60822: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 19110 1726882553.60861: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 19110 1726882553.60895: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 19110 1726882553.60998: variable '__network_is_ostree' from source: set_fact 19110 1726882553.61010: Evaluated conditional (not __network_is_ostree is defined): False 19110 1726882553.61016: when evaluation is False, skipping this task 19110 1726882553.61023: _execute() done 19110 1726882553.61033: dumping result to json 19110 1726882553.61041: done dumping result, returning 19110 1726882553.61051: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if system is ostree [0e448fcc-3ce9-5372-c19a-00000000027b] 19110 1726882553.61061: sending task result for task 0e448fcc-3ce9-5372-c19a-00000000027b skipping: [managed_node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 19110 1726882553.61211: no more pending results, returning what we have 19110 1726882553.61215: results queue empty 19110 1726882553.61216: checking for any_errors_fatal 19110 1726882553.61223: done checking for any_errors_fatal 19110 1726882553.61224: checking for max_fail_percentage 19110 1726882553.61225: done checking for max_fail_percentage 19110 1726882553.61226: checking to see if all hosts have failed and the running result is not ok 19110 1726882553.61227: done checking to see if all hosts have failed 19110 1726882553.61227: getting the remaining hosts for this loop 19110 1726882553.61229: done getting the remaining hosts for this loop 19110 1726882553.61236: getting the next task for host managed_node1 19110 1726882553.61242: done getting next task for host managed_node1 19110 
1726882553.61246: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 19110 1726882553.61252: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19110 1726882553.61267: getting variables 19110 1726882553.61269: in VariableManager get_vars() 19110 1726882553.61305: Calling all_inventory to load vars for managed_node1 19110 1726882553.61309: Calling groups_inventory to load vars for managed_node1 19110 1726882553.61311: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882553.61321: Calling all_plugins_play to load vars for managed_node1 19110 1726882553.61324: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882553.61327: Calling groups_plugins_play to load vars for managed_node1 19110 1726882553.61544: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882553.62131: done with get_vars() 19110 1726882553.62143: done getting variables 19110 1726882553.62334: done sending task result for task 0e448fcc-3ce9-5372-c19a-00000000027b 19110 1726882553.62338: WORKER PROCESS EXITING 19110 1726882553.62376: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 
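The ostree checks above are both guarded by `not __network_is_ostree is defined`, i.e. they run only once per host and are skipped on later passes once `set_fact` has stored the flag. A small sketch of that "skip if already set" pattern, with an assumed prior fact value for illustration:

```python
# Assume an earlier set_fact already recorded the flag (value is hypothetical):
host_facts = {"__network_is_ostree": False}

# Jinja's "not __network_is_ostree is defined" becomes a simple membership test.
run_task = "__network_is_ostree" not in host_facts

# False -> both the stat task and the follow-up set_fact are skipped,
# matching the "Conditional result was False" results in the log.
print(run_task)  # -> False
```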
TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 21:35:53 -0400 (0:00:00.035) 0:00:10.481 ****** 19110 1726882553.62407: entering _queue_task() for managed_node1/set_fact 19110 1726882553.62628: worker is 1 (out of 1 available) 19110 1726882553.62640: exiting _queue_task() for managed_node1/set_fact 19110 1726882553.62650: done queuing things up, now waiting for results queue to drain 19110 1726882553.62652: waiting for pending results... 19110 1726882553.63360: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 19110 1726882553.63609: in run() - task 0e448fcc-3ce9-5372-c19a-00000000027c 19110 1726882553.63628: variable 'ansible_search_path' from source: unknown 19110 1726882553.63634: variable 'ansible_search_path' from source: unknown 19110 1726882553.63690: calling self._execute() 19110 1726882553.63784: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882553.63883: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882553.63933: variable 'omit' from source: magic vars 19110 1726882553.64913: variable 'ansible_distribution_major_version' from source: facts 19110 1726882553.64931: Evaluated conditional (ansible_distribution_major_version != '6'): True 19110 1726882553.65147: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 19110 1726882553.65573: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 19110 1726882553.65621: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 19110 1726882553.65667: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 19110 
1726882553.65707: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 19110 1726882553.65794: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 19110 1726882553.65822: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 19110 1726882553.65852: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 19110 1726882553.65891: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 19110 1726882553.65980: variable '__network_is_ostree' from source: set_fact 19110 1726882553.65996: Evaluated conditional (not __network_is_ostree is defined): False 19110 1726882553.66002: when evaluation is False, skipping this task 19110 1726882553.66008: _execute() done 19110 1726882553.66013: dumping result to json 19110 1726882553.66020: done dumping result, returning 19110 1726882553.66030: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0e448fcc-3ce9-5372-c19a-00000000027c] 19110 1726882553.66039: sending task result for task 0e448fcc-3ce9-5372-c19a-00000000027c skipping: [managed_node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 19110 1726882553.66177: no more pending results, returning what we have 19110 1726882553.66181: results queue empty 19110 1726882553.66182: checking for any_errors_fatal 19110 1726882553.66190: done checking 
for any_errors_fatal 19110 1726882553.66191: checking for max_fail_percentage 19110 1726882553.66192: done checking for max_fail_percentage 19110 1726882553.66194: checking to see if all hosts have failed and the running result is not ok 19110 1726882553.66195: done checking to see if all hosts have failed 19110 1726882553.66195: getting the remaining hosts for this loop 19110 1726882553.66197: done getting the remaining hosts for this loop 19110 1726882553.66201: getting the next task for host managed_node1 19110 1726882553.66209: done getting next task for host managed_node1 19110 1726882553.66213: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 19110 1726882553.66216: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19110 1726882553.66229: getting variables 19110 1726882553.66231: in VariableManager get_vars() 19110 1726882553.66271: Calling all_inventory to load vars for managed_node1 19110 1726882553.66274: Calling groups_inventory to load vars for managed_node1 19110 1726882553.66277: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882553.66288: Calling all_plugins_play to load vars for managed_node1 19110 1726882553.66290: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882553.66293: Calling groups_plugins_play to load vars for managed_node1 19110 1726882553.66878: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882553.67326: done with get_vars() 19110 1726882553.67337: done getting variables 19110 1726882553.67370: done sending task result for task 0e448fcc-3ce9-5372-c19a-00000000027c 19110 1726882553.67373: WORKER PROCESS EXITING TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 21:35:53 -0400 (0:00:00.050) 0:00:10.531 ****** 19110 1726882553.67437: entering _queue_task() for managed_node1/service_facts 19110 1726882553.67439: Creating lock for service_facts 19110 1726882553.68528: worker is 1 (out of 1 available) 19110 1726882553.68540: exiting _queue_task() for managed_node1/service_facts 19110 1726882553.68551: done queuing things up, now waiting for results queue to drain 19110 1726882553.68553: waiting for pending results... 
19110 1726882553.68892: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which services are running 19110 1726882553.69023: in run() - task 0e448fcc-3ce9-5372-c19a-00000000027e 19110 1726882553.69044: variable 'ansible_search_path' from source: unknown 19110 1726882553.69049: variable 'ansible_search_path' from source: unknown 19110 1726882553.69089: calling self._execute() 19110 1726882553.69176: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882553.69187: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882553.69200: variable 'omit' from source: magic vars 19110 1726882553.69561: variable 'ansible_distribution_major_version' from source: facts 19110 1726882553.69581: Evaluated conditional (ansible_distribution_major_version != '6'): True 19110 1726882553.69591: variable 'omit' from source: magic vars 19110 1726882553.69645: variable 'omit' from source: magic vars 19110 1726882553.69688: variable 'omit' from source: magic vars 19110 1726882553.69732: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19110 1726882553.69775: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19110 1726882553.69799: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19110 1726882553.69821: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19110 1726882553.69838: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19110 1726882553.69876: variable 'inventory_hostname' from source: host vars for 'managed_node1' 19110 1726882553.69885: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882553.69892: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node1' 19110 1726882553.69995: Set connection var ansible_timeout to 10 19110 1726882553.70013: Set connection var ansible_module_compression to ZIP_DEFLATED 19110 1726882553.70022: Set connection var ansible_shell_executable to /bin/sh 19110 1726882553.70028: Set connection var ansible_shell_type to sh 19110 1726882553.70033: Set connection var ansible_connection to ssh 19110 1726882553.70041: Set connection var ansible_pipelining to False 19110 1726882553.70069: variable 'ansible_shell_executable' from source: unknown 19110 1726882553.70077: variable 'ansible_connection' from source: unknown 19110 1726882553.70088: variable 'ansible_module_compression' from source: unknown 19110 1726882553.70095: variable 'ansible_shell_type' from source: unknown 19110 1726882553.70101: variable 'ansible_shell_executable' from source: unknown 19110 1726882553.70107: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882553.70114: variable 'ansible_pipelining' from source: unknown 19110 1726882553.70119: variable 'ansible_timeout' from source: unknown 19110 1726882553.70126: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882553.70319: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 19110 1726882553.70334: variable 'omit' from source: magic vars 19110 1726882553.70344: starting attempt loop 19110 1726882553.70350: running the handler 19110 1726882553.70370: _low_level_execute_command(): starting 19110 1726882553.70383: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 19110 1726882553.71199: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882553.71218: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
<<< 19110 1726882553.71241: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882553.71270: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882553.71329: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882553.71347: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882553.71364: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882553.71395: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882553.71408: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882553.71426: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882553.71448: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882553.71467: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882553.71490: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882553.71507: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882553.71518: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882553.71532: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882553.71630: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882553.71661: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882553.71686: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882553.71837: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 
1726882553.73529: stdout chunk (state=3): >>>/root <<< 19110 1726882553.73660: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882553.73777: stderr chunk (state=3): >>><<< 19110 1726882553.73789: stdout chunk (state=3): >>><<< 19110 1726882553.73843: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 1726882553.73870: _low_level_execute_command(): starting 19110 1726882553.73938: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882553.7382684-19629-167334399602552 `" && echo ansible-tmp-1726882553.7382684-19629-167334399602552="` echo /root/.ansible/tmp/ansible-tmp-1726882553.7382684-19629-167334399602552 `" ) && sleep 0' 19110 1726882553.76406: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 
3.2.2 4 Jun 2024 <<< 19110 1726882553.76546: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882553.76566: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882553.76584: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882553.76624: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882553.76637: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882553.76655: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882553.76678: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882553.76712: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882553.76725: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882553.76736: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882553.76750: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882553.76773: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882553.76877: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882553.76891: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882553.76903: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882553.76984: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882553.77008: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882553.77023: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 
<<< 19110 1726882553.77145: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882553.79044: stdout chunk (state=3): >>>ansible-tmp-1726882553.7382684-19629-167334399602552=/root/.ansible/tmp/ansible-tmp-1726882553.7382684-19629-167334399602552 <<< 19110 1726882553.79234: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882553.79268: stderr chunk (state=3): >>><<< 19110 1726882553.79279: stdout chunk (state=3): >>><<< 19110 1726882553.79365: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882553.7382684-19629-167334399602552=/root/.ansible/tmp/ansible-tmp-1726882553.7382684-19629-167334399602552 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 1726882553.79368: variable 'ansible_module_compression' from source: unknown 19110 1726882553.79448: ANSIBALLZ: Using lock for service_facts 19110 
1726882553.79451: ANSIBALLZ: Acquiring lock 19110 1726882553.79453: ANSIBALLZ: Lock acquired: 139855631806544 19110 1726882553.79456: ANSIBALLZ: Creating module 19110 1726882554.03026: ANSIBALLZ: Writing module into payload 19110 1726882554.03449: ANSIBALLZ: Writing module 19110 1726882554.03478: ANSIBALLZ: Renaming module 19110 1726882554.03482: ANSIBALLZ: Done creating module 19110 1726882554.03501: variable 'ansible_facts' from source: unknown 19110 1726882554.03578: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882553.7382684-19629-167334399602552/AnsiballZ_service_facts.py 19110 1726882554.05012: Sending initial data 19110 1726882554.05015: Sent initial data (162 bytes) 19110 1726882554.07998: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882554.08010: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882554.08018: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882554.08031: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882554.08072: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882554.08079: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882554.08089: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882554.08103: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882554.08112: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882554.08120: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882554.08125: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882554.08134: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 19110 1726882554.08146: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882554.08153: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882554.08159: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882554.08281: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882554.08349: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882554.08366: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882554.08369: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882554.08589: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882554.10419: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 <<< 19110 1726882554.10424: stderr chunk (state=3): >>>debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 19110 1726882554.10509: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 19110 1726882554.10600: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-191108pnkimox/tmpko_py50v /root/.ansible/tmp/ansible-tmp-1726882553.7382684-19629-167334399602552/AnsiballZ_service_facts.py <<< 19110 
1726882554.10693: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 19110 1726882554.12286: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882554.12375: stderr chunk (state=3): >>><<< 19110 1726882554.12379: stdout chunk (state=3): >>><<< 19110 1726882554.12397: done transferring module to remote 19110 1726882554.12409: _low_level_execute_command(): starting 19110 1726882554.12415: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882553.7382684-19629-167334399602552/ /root/.ansible/tmp/ansible-tmp-1726882553.7382684-19629-167334399602552/AnsiballZ_service_facts.py && sleep 0' 19110 1726882554.14512: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882554.14521: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882554.14532: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882554.14547: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882554.14588: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882554.14779: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882554.14789: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882554.14802: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882554.14980: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882554.14987: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882554.14995: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882554.15006: stderr chunk (state=3): >>>debug1: Reading configuration 
data /etc/ssh/ssh_config <<< 19110 1726882554.15017: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882554.15025: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882554.15031: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882554.15041: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882554.15114: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882554.15133: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882554.15145: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882554.15382: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882554.17080: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882554.17119: stderr chunk (state=3): >>><<< 19110 1726882554.17122: stdout chunk (state=3): >>><<< 19110 1726882554.17139: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 1726882554.17142: _low_level_execute_command(): starting 19110 1726882554.17148: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882553.7382684-19629-167334399602552/AnsiballZ_service_facts.py && sleep 0' 19110 1726882554.18941: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882554.18945: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882554.18993: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found <<< 19110 1726882554.18997: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration <<< 19110 1726882554.19073: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882554.19079: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found <<< 19110 1726882554.19085: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882554.19227: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 
1726882554.19292: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882554.19297: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882554.19425: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882555.52422: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", <<< 19110 1726882555.52469: stdout chunk (state=3): >>>"source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rhsmcertd.service": {"name": "rhsmcertd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": 
"systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": 
"systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static<<< 19110 1726882555.52476: stdout chunk (state=3): >>>", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"n<<< 19110 1726882555.52479: stdout chunk (state=3): >>>ame": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", 
"status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", 
"state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", <<< 19110 1726882555.52483: stdout chunk (state=3): >>>"status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhcd.service": {"name": "rhcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm-facts.service": {"name": "rhsm-facts.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm.service": {"name": "rhsm.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", 
"source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "sys<<< 19110 1726882555.52486: stdout chunk (state=3): >>>temd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", 
"source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 19110 1726882555.53770: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. <<< 19110 1726882555.53818: stderr chunk (state=3): >>><<< 19110 1726882555.53821: stdout chunk (state=3): >>><<< 19110 1726882555.53845: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", 
"status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rhsmcertd.service": {"name": "rhsmcertd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": 
"rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": 
"systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": 
{"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", 
"source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", 
"source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhcd.service": {"name": "rhcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm-facts.service": {"name": "rhsm-facts.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm.service": {"name": "rhsm.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": 
"inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": 
{"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": 
"systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
19110 1726882555.54727: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882553.7382684-19629-167334399602552/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 19110 1726882555.54737: _low_level_execute_command(): starting 19110 1726882555.54742: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882553.7382684-19629-167334399602552/ > /dev/null 2>&1 && sleep 0' 19110 1726882555.56546: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882555.56551: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882555.56593: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found <<< 19110 1726882555.56598: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration <<< 19110 1726882555.56744: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882555.56750: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found <<< 19110 1726882555.56756: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882555.56831: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882555.56881: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882555.56884: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882555.57001: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882555.58889: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882555.58892: stderr chunk (state=3): >>><<< 19110 1726882555.58895: stdout chunk (state=3): >>><<< 19110 1726882555.58916: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 1726882555.58924: handler run complete 19110 1726882555.59110: variable 'ansible_facts' from source: unknown 19110 1726882555.59252: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882555.59976: variable 'ansible_facts' from source: unknown 19110 1726882555.60207: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882555.60746: attempt loop complete, returning result 19110 1726882555.60749: _execute() done 19110 1726882555.60752: dumping result to json 19110 1726882555.61571: done dumping result, returning 19110 1726882555.61582: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which services are running [0e448fcc-3ce9-5372-c19a-00000000027e] 19110 1726882555.61587: sending task result for task 0e448fcc-3ce9-5372-c19a-00000000027e 19110 1726882555.63218: done sending task result for task 0e448fcc-3ce9-5372-c19a-00000000027e 19110 1726882555.63221: WORKER PROCESS EXITING ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 19110 1726882555.63279: no more pending results, returning what we have 19110 1726882555.63282: results queue empty 19110 1726882555.63283: checking for any_errors_fatal 19110 1726882555.63291: done checking for any_errors_fatal 19110 1726882555.63292: checking for max_fail_percentage 19110 1726882555.63293: done checking for max_fail_percentage 19110 1726882555.63294: checking to see if all hosts have failed and the running result is not ok 19110 1726882555.63295: done checking to see if all hosts have failed 19110 1726882555.63296: getting the remaining hosts for this loop 19110 1726882555.63298: done getting the remaining 
hosts for this loop 19110 1726882555.63301: getting the next task for host managed_node1 19110 1726882555.63307: done getting next task for host managed_node1 19110 1726882555.63310: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 19110 1726882555.63313: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19110 1726882555.63323: getting variables 19110 1726882555.63325: in VariableManager get_vars() 19110 1726882555.63360: Calling all_inventory to load vars for managed_node1 19110 1726882555.63365: Calling groups_inventory to load vars for managed_node1 19110 1726882555.63367: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882555.63377: Calling all_plugins_play to load vars for managed_node1 19110 1726882555.63380: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882555.63382: Calling groups_plugins_play to load vars for managed_node1 19110 1726882555.63809: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882555.64409: done with get_vars() 19110 1726882555.64580: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 21:35:55 -0400 (0:00:01.973) 0:00:12.505 ****** 19110 1726882555.64811: entering 
_queue_task() for managed_node1/package_facts 19110 1726882555.64813: Creating lock for package_facts 19110 1726882555.65734: worker is 1 (out of 1 available) 19110 1726882555.65972: exiting _queue_task() for managed_node1/package_facts 19110 1726882555.65992: done queuing things up, now waiting for results queue to drain 19110 1726882555.65994: waiting for pending results... 19110 1726882555.66193: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which packages are installed 19110 1726882555.66321: in run() - task 0e448fcc-3ce9-5372-c19a-00000000027f 19110 1726882555.66340: variable 'ansible_search_path' from source: unknown 19110 1726882555.66347: variable 'ansible_search_path' from source: unknown 19110 1726882555.66391: calling self._execute() 19110 1726882555.66474: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882555.66485: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882555.66503: variable 'omit' from source: magic vars 19110 1726882555.67190: variable 'ansible_distribution_major_version' from source: facts 19110 1726882555.67275: Evaluated conditional (ansible_distribution_major_version != '6'): True 19110 1726882555.67286: variable 'omit' from source: magic vars 19110 1726882555.67341: variable 'omit' from source: magic vars 19110 1726882555.67505: variable 'omit' from source: magic vars 19110 1726882555.67549: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19110 1726882555.67644: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19110 1726882555.67710: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19110 1726882555.67765: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19110 1726882555.68518: Loading ShellModule 
'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19110 1726882555.68705: variable 'inventory_hostname' from source: host vars for 'managed_node1' 19110 1726882555.68714: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882555.68727: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882555.68829: Set connection var ansible_timeout to 10 19110 1726882555.68912: Set connection var ansible_module_compression to ZIP_DEFLATED 19110 1726882555.68923: Set connection var ansible_shell_executable to /bin/sh 19110 1726882555.68929: Set connection var ansible_shell_type to sh 19110 1726882555.68936: Set connection var ansible_connection to ssh 19110 1726882555.68949: Set connection var ansible_pipelining to False 19110 1726882555.69010: variable 'ansible_shell_executable' from source: unknown 19110 1726882555.69018: variable 'ansible_connection' from source: unknown 19110 1726882555.69025: variable 'ansible_module_compression' from source: unknown 19110 1726882555.69032: variable 'ansible_shell_type' from source: unknown 19110 1726882555.69038: variable 'ansible_shell_executable' from source: unknown 19110 1726882555.69045: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882555.69056: variable 'ansible_pipelining' from source: unknown 19110 1726882555.69065: variable 'ansible_timeout' from source: unknown 19110 1726882555.69073: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882555.69281: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 19110 1726882555.69296: variable 'omit' from source: magic vars 19110 1726882555.69304: starting attempt loop 19110 
1726882555.69310: running the handler 19110 1726882555.69325: _low_level_execute_command(): starting 19110 1726882555.69336: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 19110 1726882555.70049: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882555.70070: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882555.70087: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882555.70105: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882555.70150: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882555.70168: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882555.70183: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882555.70202: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882555.70214: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882555.70225: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882555.70240: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882555.70260: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882555.70281: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882555.70293: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882555.70303: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882555.70316: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 
1726882555.70396: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882555.70419: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882555.70435: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882555.70567: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882555.72243: stdout chunk (state=3): >>>/root <<< 19110 1726882555.72388: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882555.72446: stderr chunk (state=3): >>><<< 19110 1726882555.72449: stdout chunk (state=3): >>><<< 19110 1726882555.72820: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 1726882555.72824: _low_level_execute_command(): starting 19110 1726882555.72828: 
_low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882555.7247317-19703-131206585073392 `" && echo ansible-tmp-1726882555.7247317-19703-131206585073392="` echo /root/.ansible/tmp/ansible-tmp-1726882555.7247317-19703-131206585073392 `" ) && sleep 0' 19110 1726882555.74533: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882555.74549: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882555.74570: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882555.74592: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882555.74631: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882555.74642: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882555.74657: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882555.74678: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882555.74693: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882555.74704: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882555.74717: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882555.74730: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882555.74744: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882555.74757: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882555.74771: stderr chunk (state=3): >>>debug2: match 
found <<< 19110 1726882555.74784: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882555.74862: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882555.74884: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882555.74901: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882555.75029: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882555.76892: stdout chunk (state=3): >>>ansible-tmp-1726882555.7247317-19703-131206585073392=/root/.ansible/tmp/ansible-tmp-1726882555.7247317-19703-131206585073392 <<< 19110 1726882555.77071: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882555.77074: stdout chunk (state=3): >>><<< 19110 1726882555.77088: stderr chunk (state=3): >>><<< 19110 1726882555.77170: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882555.7247317-19703-131206585073392=/root/.ansible/tmp/ansible-tmp-1726882555.7247317-19703-131206585073392 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 1726882555.77175: variable 'ansible_module_compression' from source: unknown 19110 1726882555.77269: ANSIBALLZ: Using lock for package_facts 19110 1726882555.77273: ANSIBALLZ: Acquiring lock 19110 1726882555.77275: ANSIBALLZ: Lock acquired: 139855630204704 19110 1726882555.77277: ANSIBALLZ: Creating module 19110 1726882556.59490: ANSIBALLZ: Writing module into payload 19110 1726882556.59912: ANSIBALLZ: Writing module 19110 1726882556.59942: ANSIBALLZ: Renaming module 19110 1726882556.59945: ANSIBALLZ: Done creating module 19110 1726882556.60101: variable 'ansible_facts' from source: unknown 19110 1726882556.60519: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882555.7247317-19703-131206585073392/AnsiballZ_package_facts.py 19110 1726882556.61962: Sending initial data 19110 1726882556.61968: Sent initial data (162 bytes) 19110 1726882556.65485: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882556.65673: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882556.65676: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882556.65678: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882556.65680: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882556.65758: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882556.65914: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882556.65917: stderr chunk (state=3): >>>debug1: configuration 
requests final Match pass <<< 19110 1726882556.65919: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882556.65921: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882556.65922: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882556.65924: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882556.65926: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882556.65927: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882556.65929: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882556.65931: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882556.65932: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882556.65979: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882556.65989: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882556.66191: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882556.68031: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 19110 1726882556.68126: stderr chunk 
(state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 19110 1726882556.68218: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-191108pnkimox/tmp9wg81t7v /root/.ansible/tmp/ansible-tmp-1726882555.7247317-19703-131206585073392/AnsiballZ_package_facts.py <<< 19110 1726882556.68310: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 19110 1726882556.71483: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882556.71607: stderr chunk (state=3): >>><<< 19110 1726882556.71611: stdout chunk (state=3): >>><<< 19110 1726882556.71637: done transferring module to remote 19110 1726882556.71648: _low_level_execute_command(): starting 19110 1726882556.71653: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882555.7247317-19703-131206585073392/ /root/.ansible/tmp/ansible-tmp-1726882555.7247317-19703-131206585073392/AnsiballZ_package_facts.py && sleep 0' 19110 1726882556.72548: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882556.72554: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882556.72593: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882556.72598: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882556.72621: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 19110 1726882556.72627: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882556.72633: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found <<< 19110 1726882556.72645: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882556.72717: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882556.72741: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882556.72744: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882556.72865: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882556.74686: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882556.74692: stderr chunk (state=3): >>><<< 19110 1726882556.74694: stdout chunk (state=3): >>><<< 19110 1726882556.74714: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 1726882556.74717: _low_level_execute_command(): starting 19110 1726882556.74724: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882555.7247317-19703-131206585073392/AnsiballZ_package_facts.py && sleep 0' 19110 1726882556.76750: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882556.76771: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882556.76786: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882556.76830: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882556.76884: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882556.76921: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882556.76929: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882556.76945: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882556.76968: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882556.76972: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882556.76981: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882556.77017: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882556.77028: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 
1726882556.77045: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882556.77060: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882556.77072: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882556.77185: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882556.77202: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882556.77213: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882556.77354: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882557.23037: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "subscription-manager-rhsm-certificates": [{"name": "subscription-manager-rhsm-certificates", "version": "20220623", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": 
[{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", 
"release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dmidecode": [{"name": "dmidecode", "version": "3.6", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": 
"1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": 
"2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": 
[{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-iniparse": [{"name": "python3-iniparse", "version": "0.4", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-inotify": [{"name": "python3-inotify", "version": 
"0.9.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-decorator": [{"name": "python3-decorator", "version": "4.4.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-cloud-what": [{"name": "python3-cloud-what", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", 
"release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "virt-what": [{"name": "virt-what", "version": "1.25", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": 
"4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": 
"dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.<<< 19110 1726882557.23059: stdout chunk (state=3): >>>9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", 
"release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "usermode": [{"name": "usermode", "version": "1.114", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", 
"version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf-plugin-subscription-manager": [{"name": "libdnf-plugin-subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-librepo": [{"name": "python3-librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-subscription-manager-rhsm": [{"name": "python3-subscription-manager-rhsm", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "subscription-manager": [{"name": "subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "policycoreutils-python-utils": [{"name": 
"policycoreutils-python-utils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "rhc": [{"name": "rhc", "version": "0.2.4", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum": [{"name": 
"yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", 
"epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": 
"noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": 
"x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": 
"3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": 
"41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "relea<<< 19110 1726882557.23092: stdout chunk (state=3): >>>se": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": 
"3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 19110 1726882557.24580: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. <<< 19110 1726882557.24584: stdout chunk (state=3): >>><<< 19110 1726882557.24586: stderr chunk (state=3): >>><<< 19110 1726882557.24634: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "subscription-manager-rhsm-certificates": [{"name": "subscription-manager-rhsm-certificates", "version": "20220623", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", 
"release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": 
[{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dmidecode": [{"name": "dmidecode", "version": "3.6", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", 
"release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", 
"release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": 
"10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-iniparse": [{"name": "python3-iniparse", "version": "0.4", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-inotify": [{"name": "python3-inotify", "version": "0.9.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-decorator": [{"name": "python3-decorator", "version": "4.4.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-cloud-what": [{"name": "python3-cloud-what", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": 
[{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "virt-what": [{"name": "virt-what", "version": "1.25", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", 
"version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": 
[{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": 
"libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "usermode": [{"name": "usermode", "version": "1.114", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": 
"python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf-plugin-subscription-manager": [{"name": "libdnf-plugin-subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-librepo": [{"name": "python3-librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-subscription-manager-rhsm": [{"name": "python3-subscription-manager-rhsm", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "subscription-manager": [{"name": "subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": 
"python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "policycoreutils-python-utils": [{"name": "policycoreutils-python-utils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "rhc": [{"name": "rhc", "version": "0.2.4", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": 
"1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": 
"8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": 
"7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": 
"6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": 
"rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 
4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", 
"version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", 
"release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": 
"5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", 
"source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": 
"python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
19110 1726882557.29366: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882555.7247317-19703-131206585073392/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 19110 1726882557.29388: _low_level_execute_command(): starting 19110 1726882557.29391: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882555.7247317-19703-131206585073392/ > /dev/null 2>&1 && sleep 0' 19110 1726882557.31376: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882557.31392: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882557.31403: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882557.31417: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882557.31462: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882557.31472: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882557.31484: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882557.31500: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882557.31508: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address 
<<< 19110 1726882557.31514: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882557.31522: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882557.31531: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882557.31609: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882557.31617: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882557.31624: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882557.31633: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882557.31823: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882557.31843: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882557.31856: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882557.31986: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882557.33885: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882557.33901: stderr chunk (state=3): >>><<< 19110 1726882557.33904: stdout chunk (state=3): >>><<< 19110 1726882557.33920: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 1726882557.33926: handler run complete 19110 1726882557.35386: variable 'ansible_facts' from source: unknown 19110 1726882557.36574: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882557.51108: variable 'ansible_facts' from source: unknown 19110 1726882557.53047: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882557.55533: attempt loop complete, returning result 19110 1726882557.55603: _execute() done 19110 1726882557.55798: dumping result to json 19110 1726882557.56170: done dumping result, returning 19110 1726882557.56183: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which packages are installed [0e448fcc-3ce9-5372-c19a-00000000027f] 19110 1726882557.56191: sending task result for task 0e448fcc-3ce9-5372-c19a-00000000027f 19110 1726882557.59659: done sending task result for task 0e448fcc-3ce9-5372-c19a-00000000027f 19110 1726882557.59663: WORKER PROCESS EXITING ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 19110 1726882557.59770: no more pending results, returning what we have 19110 1726882557.59773: results queue empty 19110 1726882557.59774: checking for 
any_errors_fatal 19110 1726882557.59778: done checking for any_errors_fatal 19110 1726882557.59779: checking for max_fail_percentage 19110 1726882557.59780: done checking for max_fail_percentage 19110 1726882557.59781: checking to see if all hosts have failed and the running result is not ok 19110 1726882557.59782: done checking to see if all hosts have failed 19110 1726882557.59782: getting the remaining hosts for this loop 19110 1726882557.59784: done getting the remaining hosts for this loop 19110 1726882557.59787: getting the next task for host managed_node1 19110 1726882557.59794: done getting next task for host managed_node1 19110 1726882557.59797: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 19110 1726882557.59799: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19110 1726882557.59809: getting variables 19110 1726882557.59810: in VariableManager get_vars() 19110 1726882557.59844: Calling all_inventory to load vars for managed_node1 19110 1726882557.59847: Calling groups_inventory to load vars for managed_node1 19110 1726882557.59849: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882557.59858: Calling all_plugins_play to load vars for managed_node1 19110 1726882557.59861: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882557.59866: Calling groups_plugins_play to load vars for managed_node1 19110 1726882557.61555: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882557.64544: done with get_vars() 19110 1726882557.64692: done getting variables 19110 1726882557.64754: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 21:35:57 -0400 (0:00:02.000) 0:00:14.506 ****** 19110 1726882557.64907: entering _queue_task() for managed_node1/debug 19110 1726882557.65682: worker is 1 (out of 1 available) 19110 1726882557.65695: exiting _queue_task() for managed_node1/debug 19110 1726882557.65708: done queuing things up, now waiting for results queue to drain 19110 1726882557.65710: waiting for pending results... 
19110 1726882557.66408: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider 19110 1726882557.66525: in run() - task 0e448fcc-3ce9-5372-c19a-00000000001a 19110 1726882557.66548: variable 'ansible_search_path' from source: unknown 19110 1726882557.66561: variable 'ansible_search_path' from source: unknown 19110 1726882557.66613: calling self._execute() 19110 1726882557.66710: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882557.66727: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882557.66741: variable 'omit' from source: magic vars 19110 1726882557.67188: variable 'ansible_distribution_major_version' from source: facts 19110 1726882557.67209: Evaluated conditional (ansible_distribution_major_version != '6'): True 19110 1726882557.67218: variable 'omit' from source: magic vars 19110 1726882557.67267: variable 'omit' from source: magic vars 19110 1726882557.67384: variable 'network_provider' from source: set_fact 19110 1726882557.67405: variable 'omit' from source: magic vars 19110 1726882557.67462: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19110 1726882557.67509: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19110 1726882557.67532: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19110 1726882557.67560: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19110 1726882557.67585: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19110 1726882557.67616: variable 'inventory_hostname' from source: host vars for 'managed_node1' 19110 1726882557.67625: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 
1726882557.67636: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882557.67762: Set connection var ansible_timeout to 10 19110 1726882557.67783: Set connection var ansible_module_compression to ZIP_DEFLATED 19110 1726882557.67799: Set connection var ansible_shell_executable to /bin/sh 19110 1726882557.67806: Set connection var ansible_shell_type to sh 19110 1726882557.67812: Set connection var ansible_connection to ssh 19110 1726882557.67820: Set connection var ansible_pipelining to False 19110 1726882557.67845: variable 'ansible_shell_executable' from source: unknown 19110 1726882557.67852: variable 'ansible_connection' from source: unknown 19110 1726882557.67861: variable 'ansible_module_compression' from source: unknown 19110 1726882557.67874: variable 'ansible_shell_type' from source: unknown 19110 1726882557.67884: variable 'ansible_shell_executable' from source: unknown 19110 1726882557.67891: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882557.67898: variable 'ansible_pipelining' from source: unknown 19110 1726882557.67911: variable 'ansible_timeout' from source: unknown 19110 1726882557.67918: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882557.68195: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 19110 1726882557.68283: variable 'omit' from source: magic vars 19110 1726882557.68294: starting attempt loop 19110 1726882557.68301: running the handler 19110 1726882557.68483: handler run complete 19110 1726882557.68503: attempt loop complete, returning result 19110 1726882557.68510: _execute() done 19110 1726882557.68520: dumping result to json 19110 1726882557.68529: done dumping result, returning 
19110 1726882557.68648: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider [0e448fcc-3ce9-5372-c19a-00000000001a] 19110 1726882557.68661: sending task result for task 0e448fcc-3ce9-5372-c19a-00000000001a ok: [managed_node1] => {} MSG: Using network provider: nm 19110 1726882557.68972: done sending task result for task 0e448fcc-3ce9-5372-c19a-00000000001a 19110 1726882557.68976: WORKER PROCESS EXITING 19110 1726882557.68984: no more pending results, returning what we have 19110 1726882557.68987: results queue empty 19110 1726882557.68988: checking for any_errors_fatal 19110 1726882557.69001: done checking for any_errors_fatal 19110 1726882557.69002: checking for max_fail_percentage 19110 1726882557.69003: done checking for max_fail_percentage 19110 1726882557.69004: checking to see if all hosts have failed and the running result is not ok 19110 1726882557.69005: done checking to see if all hosts have failed 19110 1726882557.69006: getting the remaining hosts for this loop 19110 1726882557.69007: done getting the remaining hosts for this loop 19110 1726882557.69012: getting the next task for host managed_node1 19110 1726882557.69018: done getting next task for host managed_node1 19110 1726882557.69022: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 19110 1726882557.69024: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19110 1726882557.69033: getting variables 19110 1726882557.69035: in VariableManager get_vars() 19110 1726882557.69080: Calling all_inventory to load vars for managed_node1 19110 1726882557.69083: Calling groups_inventory to load vars for managed_node1 19110 1726882557.69085: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882557.69095: Calling all_plugins_play to load vars for managed_node1 19110 1726882557.69097: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882557.69099: Calling groups_plugins_play to load vars for managed_node1 19110 1726882557.71937: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882557.75497: done with get_vars() 19110 1726882557.75528: done getting variables 19110 1726882557.75712: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 21:35:57 -0400 (0:00:00.108) 0:00:14.614 ****** 19110 1726882557.75743: entering _queue_task() for managed_node1/fail 19110 1726882557.76325: worker is 1 (out of 1 available) 19110 1726882557.76339: exiting _queue_task() for managed_node1/fail 19110 1726882557.76725: done queuing things up, now waiting for results queue to drain 19110 1726882557.76728: waiting for pending results... 
19110 1726882557.77129: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 19110 1726882557.77233: in run() - task 0e448fcc-3ce9-5372-c19a-00000000001b 19110 1726882557.77247: variable 'ansible_search_path' from source: unknown 19110 1726882557.77249: variable 'ansible_search_path' from source: unknown 19110 1726882557.77300: calling self._execute() 19110 1726882557.77395: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882557.77406: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882557.77410: variable 'omit' from source: magic vars 19110 1726882557.77803: variable 'ansible_distribution_major_version' from source: facts 19110 1726882557.77815: Evaluated conditional (ansible_distribution_major_version != '6'): True 19110 1726882557.77948: variable 'network_state' from source: role '' defaults 19110 1726882557.77959: Evaluated conditional (network_state != {}): False 19110 1726882557.77963: when evaluation is False, skipping this task 19110 1726882557.77967: _execute() done 19110 1726882557.77970: dumping result to json 19110 1726882557.77972: done dumping result, returning 19110 1726882557.77977: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0e448fcc-3ce9-5372-c19a-00000000001b] 19110 1726882557.77984: sending task result for task 0e448fcc-3ce9-5372-c19a-00000000001b skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 19110 1726882557.78136: no more pending results, returning what we have 19110 1726882557.78140: results queue empty 19110 1726882557.78141: checking for any_errors_fatal 19110 1726882557.78148: done 
checking for any_errors_fatal 19110 1726882557.78149: checking for max_fail_percentage 19110 1726882557.78151: done checking for max_fail_percentage 19110 1726882557.78152: checking to see if all hosts have failed and the running result is not ok 19110 1726882557.78153: done checking to see if all hosts have failed 19110 1726882557.78154: getting the remaining hosts for this loop 19110 1726882557.78156: done getting the remaining hosts for this loop 19110 1726882557.78160: getting the next task for host managed_node1 19110 1726882557.78169: done getting next task for host managed_node1 19110 1726882557.78173: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 19110 1726882557.78177: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19110 1726882557.78197: getting variables 19110 1726882557.78199: in VariableManager get_vars() 19110 1726882557.78240: Calling all_inventory to load vars for managed_node1 19110 1726882557.78244: Calling groups_inventory to load vars for managed_node1 19110 1726882557.78246: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882557.78261: Calling all_plugins_play to load vars for managed_node1 19110 1726882557.78266: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882557.78270: Calling groups_plugins_play to load vars for managed_node1 19110 1726882557.78902: done sending task result for task 0e448fcc-3ce9-5372-c19a-00000000001b 19110 1726882557.78905: WORKER PROCESS EXITING 19110 1726882557.81028: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882557.83290: done with get_vars() 19110 1726882557.83335: done getting variables 19110 1726882557.83401: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 21:35:57 -0400 (0:00:00.076) 0:00:14.691 ****** 19110 1726882557.83443: entering _queue_task() for managed_node1/fail 19110 1726882557.83790: worker is 1 (out of 1 available) 19110 1726882557.83803: exiting _queue_task() for managed_node1/fail 19110 1726882557.83816: done queuing things up, now waiting for results queue to drain 19110 1726882557.83818: waiting for pending results... 
19110 1726882557.84791: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 19110 1726882557.84887: in run() - task 0e448fcc-3ce9-5372-c19a-00000000001c 19110 1726882557.84899: variable 'ansible_search_path' from source: unknown 19110 1726882557.84903: variable 'ansible_search_path' from source: unknown 19110 1726882557.84948: calling self._execute() 19110 1726882557.85047: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882557.85051: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882557.85062: variable 'omit' from source: magic vars 19110 1726882557.85491: variable 'ansible_distribution_major_version' from source: facts 19110 1726882557.85504: Evaluated conditional (ansible_distribution_major_version != '6'): True 19110 1726882557.85633: variable 'network_state' from source: role '' defaults 19110 1726882557.85641: Evaluated conditional (network_state != {}): False 19110 1726882557.85644: when evaluation is False, skipping this task 19110 1726882557.85647: _execute() done 19110 1726882557.85650: dumping result to json 19110 1726882557.85652: done dumping result, returning 19110 1726882557.85662: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0e448fcc-3ce9-5372-c19a-00000000001c] 19110 1726882557.85674: sending task result for task 0e448fcc-3ce9-5372-c19a-00000000001c 19110 1726882557.85771: done sending task result for task 0e448fcc-3ce9-5372-c19a-00000000001c 19110 1726882557.85776: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 19110 1726882557.85836: no more pending results, returning what we have 19110 
1726882557.85840: results queue empty 19110 1726882557.85841: checking for any_errors_fatal 19110 1726882557.85849: done checking for any_errors_fatal 19110 1726882557.85850: checking for max_fail_percentage 19110 1726882557.85851: done checking for max_fail_percentage 19110 1726882557.85852: checking to see if all hosts have failed and the running result is not ok 19110 1726882557.85853: done checking to see if all hosts have failed 19110 1726882557.85854: getting the remaining hosts for this loop 19110 1726882557.85855: done getting the remaining hosts for this loop 19110 1726882557.85859: getting the next task for host managed_node1 19110 1726882557.85869: done getting next task for host managed_node1 19110 1726882557.85873: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 19110 1726882557.85876: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19110 1726882557.85892: getting variables 19110 1726882557.85895: in VariableManager get_vars() 19110 1726882557.85933: Calling all_inventory to load vars for managed_node1 19110 1726882557.85936: Calling groups_inventory to load vars for managed_node1 19110 1726882557.85938: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882557.85950: Calling all_plugins_play to load vars for managed_node1 19110 1726882557.85953: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882557.85955: Calling groups_plugins_play to load vars for managed_node1 19110 1726882557.88670: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882557.90606: done with get_vars() 19110 1726882557.90632: done getting variables 19110 1726882557.90705: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 21:35:57 -0400 (0:00:00.072) 0:00:14.764 ****** 19110 1726882557.90736: entering _queue_task() for managed_node1/fail 19110 1726882557.91067: worker is 1 (out of 1 available) 19110 1726882557.91084: exiting _queue_task() for managed_node1/fail 19110 1726882557.91100: done queuing things up, now waiting for results queue to drain 19110 1726882557.91102: waiting for pending results... 
19110 1726882557.91388: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 19110 1726882557.91533: in run() - task 0e448fcc-3ce9-5372-c19a-00000000001d 19110 1726882557.91538: variable 'ansible_search_path' from source: unknown 19110 1726882557.91557: variable 'ansible_search_path' from source: unknown 19110 1726882557.91584: calling self._execute() 19110 1726882557.91696: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882557.91700: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882557.91708: variable 'omit' from source: magic vars 19110 1726882557.92180: variable 'ansible_distribution_major_version' from source: facts 19110 1726882557.92184: Evaluated conditional (ansible_distribution_major_version != '6'): True 19110 1726882557.92467: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 19110 1726882557.95543: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 19110 1726882557.95628: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 19110 1726882557.95787: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 19110 1726882557.95827: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 19110 1726882557.95853: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 19110 1726882557.96057: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19110 1726882557.96085: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19110 1726882557.96237: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19110 1726882557.96340: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19110 1726882557.96359: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19110 1726882557.96578: variable 'ansible_distribution_major_version' from source: facts 19110 1726882557.96593: Evaluated conditional (ansible_distribution_major_version | int > 9): False 19110 1726882557.96596: when evaluation is False, skipping this task 19110 1726882557.96600: _execute() done 19110 1726882557.96602: dumping result to json 19110 1726882557.96605: done dumping result, returning 19110 1726882557.96614: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0e448fcc-3ce9-5372-c19a-00000000001d] 19110 1726882557.96620: sending task result for task 0e448fcc-3ce9-5372-c19a-00000000001d 19110 1726882557.96884: done sending task result for task 0e448fcc-3ce9-5372-c19a-00000000001d 19110 1726882557.96888: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version | int > 9", "skip_reason": "Conditional result was False" } 19110 1726882557.96939: no more pending results, returning what we have 19110 1726882557.96943: 
results queue empty 19110 1726882557.96944: checking for any_errors_fatal 19110 1726882557.96951: done checking for any_errors_fatal 19110 1726882557.96952: checking for max_fail_percentage 19110 1726882557.96954: done checking for max_fail_percentage 19110 1726882557.96955: checking to see if all hosts have failed and the running result is not ok 19110 1726882557.96956: done checking to see if all hosts have failed 19110 1726882557.96956: getting the remaining hosts for this loop 19110 1726882557.96958: done getting the remaining hosts for this loop 19110 1726882557.96962: getting the next task for host managed_node1 19110 1726882557.96969: done getting next task for host managed_node1 19110 1726882557.96977: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 19110 1726882557.96980: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19110 1726882557.96995: getting variables 19110 1726882557.96997: in VariableManager get_vars() 19110 1726882557.97039: Calling all_inventory to load vars for managed_node1 19110 1726882557.97042: Calling groups_inventory to load vars for managed_node1 19110 1726882557.97045: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882557.97056: Calling all_plugins_play to load vars for managed_node1 19110 1726882557.97058: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882557.97061: Calling groups_plugins_play to load vars for managed_node1 19110 1726882558.00728: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882558.04687: done with get_vars() 19110 1726882558.04720: done getting variables 19110 1726882558.04836: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 21:35:58 -0400 (0:00:00.141) 0:00:14.905 ****** 19110 1726882558.04876: entering _queue_task() for managed_node1/dnf 19110 1726882558.05236: worker is 1 (out of 1 available) 19110 1726882558.05306: exiting _queue_task() for managed_node1/dnf 19110 1726882558.05358: done queuing things up, now waiting for results queue to drain 19110 1726882558.05360: waiting for pending results... 
19110 1726882558.06921: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 19110 1726882558.07151: in run() - task 0e448fcc-3ce9-5372-c19a-00000000001e 19110 1726882558.07179: variable 'ansible_search_path' from source: unknown 19110 1726882558.07188: variable 'ansible_search_path' from source: unknown 19110 1726882558.07243: calling self._execute() 19110 1726882558.07386: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882558.07401: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882558.07423: variable 'omit' from source: magic vars 19110 1726882558.08219: variable 'ansible_distribution_major_version' from source: facts 19110 1726882558.08376: Evaluated conditional (ansible_distribution_major_version != '6'): True 19110 1726882558.09013: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 19110 1726882558.14085: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 19110 1726882558.14220: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 19110 1726882558.14289: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 19110 1726882558.14343: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 19110 1726882558.14408: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 19110 1726882558.14507: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19110 1726882558.14551: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19110 1726882558.14598: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19110 1726882558.14662: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19110 1726882558.14700: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19110 1726882558.14882: variable 'ansible_distribution' from source: facts 19110 1726882558.14895: variable 'ansible_distribution_major_version' from source: facts 19110 1726882558.14921: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 19110 1726882558.15089: variable '__network_wireless_connections_defined' from source: role '' defaults 19110 1726882558.15259: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19110 1726882558.15299: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19110 1726882558.15356: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19110 1726882558.15421: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19110 1726882558.15460: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19110 1726882558.15527: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19110 1726882558.15582: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19110 1726882558.15840: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19110 1726882558.15894: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19110 1726882558.15930: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19110 1726882558.15983: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19110 1726882558.16011: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19110 
1726882558.16052: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19110 1726882558.16107: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19110 1726882558.16137: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19110 1726882558.16377: variable 'network_connections' from source: play vars 19110 1726882558.16396: variable 'interface' from source: set_fact 19110 1726882558.16481: variable 'interface' from source: set_fact 19110 1726882558.16494: variable 'interface' from source: set_fact 19110 1726882558.16590: variable 'interface' from source: set_fact 19110 1726882558.16697: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 19110 1726882558.16918: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 19110 1726882558.16973: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 19110 1726882558.17040: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 19110 1726882558.17083: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 19110 1726882558.17147: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 19110 1726882558.17180: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 19110 1726882558.17225: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 19110 1726882558.17264: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 19110 1726882558.17399: variable '__network_team_connections_defined' from source: role '' defaults 19110 1726882558.18108: variable 'network_connections' from source: play vars 19110 1726882558.18122: variable 'interface' from source: set_fact 19110 1726882558.18306: variable 'interface' from source: set_fact 19110 1726882558.18331: variable 'interface' from source: set_fact 19110 1726882558.18508: variable 'interface' from source: set_fact 19110 1726882558.18574: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 19110 1726882558.18582: when evaluation is False, skipping this task 19110 1726882558.18615: _execute() done 19110 1726882558.18622: dumping result to json 19110 1726882558.18629: done dumping result, returning 19110 1726882558.18648: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0e448fcc-3ce9-5372-c19a-00000000001e] 19110 1726882558.18707: sending task result for task 0e448fcc-3ce9-5372-c19a-00000000001e skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 19110 1726882558.18932: no more pending results, returning what we have 19110 1726882558.18936: results queue 
empty 19110 1726882558.18937: checking for any_errors_fatal 19110 1726882558.18947: done checking for any_errors_fatal 19110 1726882558.18948: checking for max_fail_percentage 19110 1726882558.18950: done checking for max_fail_percentage 19110 1726882558.18951: checking to see if all hosts have failed and the running result is not ok 19110 1726882558.18952: done checking to see if all hosts have failed 19110 1726882558.18953: getting the remaining hosts for this loop 19110 1726882558.18957: done getting the remaining hosts for this loop 19110 1726882558.18962: getting the next task for host managed_node1 19110 1726882558.18970: done getting next task for host managed_node1 19110 1726882558.18977: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 19110 1726882558.18979: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19110 1726882558.18993: getting variables 19110 1726882558.18995: in VariableManager get_vars() 19110 1726882558.19038: Calling all_inventory to load vars for managed_node1 19110 1726882558.19041: Calling groups_inventory to load vars for managed_node1 19110 1726882558.19044: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882558.19059: Calling all_plugins_play to load vars for managed_node1 19110 1726882558.19062: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882558.19069: Calling groups_plugins_play to load vars for managed_node1 19110 1726882558.20304: done sending task result for task 0e448fcc-3ce9-5372-c19a-00000000001e 19110 1726882558.20307: WORKER PROCESS EXITING 19110 1726882558.21500: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882558.26598: done with get_vars() 19110 1726882558.26643: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 19110 1726882558.26949: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 21:35:58 -0400 (0:00:00.221) 0:00:15.127 ****** 19110 1726882558.27035: entering _queue_task() for managed_node1/yum 19110 1726882558.27037: Creating lock for yum 19110 1726882558.28109: worker is 1 (out of 1 available) 19110 1726882558.28128: exiting _queue_task() for managed_node1/yum 19110 
1726882558.28141: done queuing things up, now waiting for results queue to drain 19110 1726882558.28143: waiting for pending results... 19110 1726882558.28976: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 19110 1726882558.29125: in run() - task 0e448fcc-3ce9-5372-c19a-00000000001f 19110 1726882558.29146: variable 'ansible_search_path' from source: unknown 19110 1726882558.29158: variable 'ansible_search_path' from source: unknown 19110 1726882558.29222: calling self._execute() 19110 1726882558.29369: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882558.29488: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882558.29544: variable 'omit' from source: magic vars 19110 1726882558.31929: variable 'ansible_distribution_major_version' from source: facts 19110 1726882558.32080: Evaluated conditional (ansible_distribution_major_version != '6'): True 19110 1726882558.32512: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 19110 1726882558.46622: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 19110 1726882558.46707: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 19110 1726882558.46872: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 19110 1726882558.46981: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 19110 1726882558.47011: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 19110 1726882558.47131: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19110 1726882558.47403: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19110 1726882558.47434: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19110 1726882558.47533: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19110 1726882558.47719: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19110 1726882558.47905: variable 'ansible_distribution_major_version' from source: facts 19110 1726882558.48153: Evaluated conditional (ansible_distribution_major_version | int < 8): False 19110 1726882558.48166: when evaluation is False, skipping this task 19110 1726882558.48173: _execute() done 19110 1726882558.48179: dumping result to json 19110 1726882558.48186: done dumping result, returning 19110 1726882558.48197: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0e448fcc-3ce9-5372-c19a-00000000001f] 19110 1726882558.48205: sending task result for task 0e448fcc-3ce9-5372-c19a-00000000001f skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 19110 1726882558.48368: no more pending results, returning 
what we have 19110 1726882558.48371: results queue empty 19110 1726882558.48372: checking for any_errors_fatal 19110 1726882558.48379: done checking for any_errors_fatal 19110 1726882558.48380: checking for max_fail_percentage 19110 1726882558.48382: done checking for max_fail_percentage 19110 1726882558.48382: checking to see if all hosts have failed and the running result is not ok 19110 1726882558.48383: done checking to see if all hosts have failed 19110 1726882558.48384: getting the remaining hosts for this loop 19110 1726882558.48386: done getting the remaining hosts for this loop 19110 1726882558.48389: getting the next task for host managed_node1 19110 1726882558.48395: done getting next task for host managed_node1 19110 1726882558.48399: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 19110 1726882558.48402: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19110 1726882558.48414: getting variables 19110 1726882558.48416: in VariableManager get_vars() 19110 1726882558.48454: Calling all_inventory to load vars for managed_node1 19110 1726882558.48460: Calling groups_inventory to load vars for managed_node1 19110 1726882558.48462: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882558.48473: Calling all_plugins_play to load vars for managed_node1 19110 1726882558.48476: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882558.48478: Calling groups_plugins_play to load vars for managed_node1 19110 1726882558.49554: done sending task result for task 0e448fcc-3ce9-5372-c19a-00000000001f 19110 1726882558.49561: WORKER PROCESS EXITING 19110 1726882558.61686: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882558.66797: done with get_vars() 19110 1726882558.66831: done getting variables 19110 1726882558.67086: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 21:35:58 -0400 (0:00:00.400) 0:00:15.528 ****** 19110 1726882558.67120: entering _queue_task() for managed_node1/fail 19110 1726882558.67793: worker is 1 (out of 1 available) 19110 1726882558.67805: exiting _queue_task() for managed_node1/fail 19110 1726882558.67817: done queuing things up, now waiting for results queue to drain 19110 1726882558.67820: waiting for pending results... 
19110 1726882558.71835: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 19110 1726882558.72321: in run() - task 0e448fcc-3ce9-5372-c19a-000000000020 19110 1726882558.72335: variable 'ansible_search_path' from source: unknown 19110 1726882558.72339: variable 'ansible_search_path' from source: unknown 19110 1726882558.72382: calling self._execute() 19110 1726882558.72703: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882558.72707: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882558.72718: variable 'omit' from source: magic vars 19110 1726882558.73565: variable 'ansible_distribution_major_version' from source: facts 19110 1726882558.73577: Evaluated conditional (ansible_distribution_major_version != '6'): True 19110 1726882558.73819: variable '__network_wireless_connections_defined' from source: role '' defaults 19110 1726882558.74425: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 19110 1726882558.79689: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 19110 1726882558.79886: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 19110 1726882558.79921: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 19110 1726882558.80075: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 19110 1726882558.80100: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 19110 1726882558.80297: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 19110 1726882558.80325: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19110 1726882558.80349: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19110 1726882558.80509: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19110 1726882558.80524: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19110 1726882558.80570: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19110 1726882558.80713: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19110 1726882558.80735: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19110 1726882558.80774: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19110 1726882558.80787: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19110 1726882558.80947: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19110 1726882558.80972: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19110 1726882558.80996: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19110 1726882558.81158: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19110 1726882558.81171: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19110 1726882558.81594: variable 'network_connections' from source: play vars 19110 1726882558.81607: variable 'interface' from source: set_fact 19110 1726882558.81905: variable 'interface' from source: set_fact 19110 1726882558.81918: variable 'interface' from source: set_fact 19110 1726882558.82009: variable 'interface' from source: set_fact 19110 1726882558.82176: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 19110 1726882558.82679: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 19110 1726882558.82718: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 19110 1726882558.82786: Loading 
TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 19110 1726882558.83623: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 19110 1726882558.83668: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 19110 1726882558.83696: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 19110 1726882558.83948: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 19110 1726882558.83976: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 19110 1726882558.84038: variable '__network_team_connections_defined' from source: role '' defaults 19110 1726882558.84966: variable 'network_connections' from source: play vars 19110 1726882558.84970: variable 'interface' from source: set_fact 19110 1726882558.85159: variable 'interface' from source: set_fact 19110 1726882558.85162: variable 'interface' from source: set_fact 19110 1726882558.85224: variable 'interface' from source: set_fact 19110 1726882558.85480: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 19110 1726882558.85484: when evaluation is False, skipping this task 19110 1726882558.85486: _execute() done 19110 1726882558.85489: dumping result to json 19110 1726882558.85491: done dumping result, returning 19110 1726882558.85501: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's 
consent to restart NetworkManager due to wireless or team interfaces [0e448fcc-3ce9-5372-c19a-000000000020] 19110 1726882558.85512: sending task result for task 0e448fcc-3ce9-5372-c19a-000000000020 skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 19110 1726882558.85652: no more pending results, returning what we have 19110 1726882558.85656: results queue empty 19110 1726882558.85657: checking for any_errors_fatal 19110 1726882558.85667: done checking for any_errors_fatal 19110 1726882558.85668: checking for max_fail_percentage 19110 1726882558.85670: done checking for max_fail_percentage 19110 1726882558.85671: checking to see if all hosts have failed and the running result is not ok 19110 1726882558.85671: done checking to see if all hosts have failed 19110 1726882558.85672: getting the remaining hosts for this loop 19110 1726882558.85675: done getting the remaining hosts for this loop 19110 1726882558.85679: getting the next task for host managed_node1 19110 1726882558.85685: done getting next task for host managed_node1 19110 1726882558.85690: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 19110 1726882558.85692: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19110 1726882558.85709: getting variables 19110 1726882558.85711: in VariableManager get_vars() 19110 1726882558.85751: Calling all_inventory to load vars for managed_node1 19110 1726882558.85754: Calling groups_inventory to load vars for managed_node1 19110 1726882558.85757: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882558.85770: Calling all_plugins_play to load vars for managed_node1 19110 1726882558.85773: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882558.85777: Calling groups_plugins_play to load vars for managed_node1 19110 1726882558.86306: done sending task result for task 0e448fcc-3ce9-5372-c19a-000000000020 19110 1726882558.86310: WORKER PROCESS EXITING 19110 1726882558.89088: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882558.94136: done with get_vars() 19110 1726882558.94272: done getting variables 19110 1726882558.94384: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 21:35:58 -0400 (0:00:00.272) 0:00:15.801 ****** 19110 1726882558.94415: entering _queue_task() for managed_node1/package 19110 1726882558.94872: worker is 1 (out of 1 available) 19110 1726882558.94883: exiting _queue_task() for managed_node1/package 19110 1726882558.94895: done queuing things up, now waiting for results queue to drain 19110 1726882558.94897: waiting for pending results... 
19110 1726882558.95396: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages 19110 1726882558.95681: in run() - task 0e448fcc-3ce9-5372-c19a-000000000021 19110 1726882558.95703: variable 'ansible_search_path' from source: unknown 19110 1726882558.95711: variable 'ansible_search_path' from source: unknown 19110 1726882558.95757: calling self._execute() 19110 1726882558.95990: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882558.96117: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882558.96133: variable 'omit' from source: magic vars 19110 1726882558.96632: variable 'ansible_distribution_major_version' from source: facts 19110 1726882558.96660: Evaluated conditional (ansible_distribution_major_version != '6'): True 19110 1726882558.96902: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 19110 1726882558.97294: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 19110 1726882558.98518: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 19110 1726882558.98564: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 19110 1726882558.98643: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 19110 1726882558.98781: variable 'network_packages' from source: role '' defaults 19110 1726882558.98968: variable '__network_provider_setup' from source: role '' defaults 19110 1726882558.98984: variable '__network_service_name_default_nm' from source: role '' defaults 19110 1726882558.99057: variable '__network_service_name_default_nm' from source: role '' defaults 19110 1726882558.99076: variable '__network_packages_default_nm' from source: role '' defaults 19110 1726882558.99143: variable 
'__network_packages_default_nm' from source: role '' defaults 19110 1726882558.99950: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 19110 1726882559.02976: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 19110 1726882559.03049: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 19110 1726882559.03095: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 19110 1726882559.03133: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 19110 1726882559.03181: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 19110 1726882559.03267: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19110 1726882559.03303: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19110 1726882559.03335: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19110 1726882559.03385: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19110 1726882559.03406: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19110 
1726882559.03453: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19110 1726882559.03489: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19110 1726882559.03520: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19110 1726882559.03569: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19110 1726882559.03590: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19110 1726882559.03822: variable '__network_packages_default_gobject_packages' from source: role '' defaults 19110 1726882559.03940: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19110 1726882559.03975: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19110 1726882559.04005: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19110 1726882559.04059: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19110 1726882559.04086: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19110 1726882559.04176: variable 'ansible_python' from source: facts 19110 1726882559.04205: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 19110 1726882559.04291: variable '__network_wpa_supplicant_required' from source: role '' defaults 19110 1726882559.04378: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 19110 1726882559.04509: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19110 1726882559.04726: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19110 1726882559.04761: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19110 1726882559.04809: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19110 1726882559.04830: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19110 1726882559.04883: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19110 1726882559.05037: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19110 1726882559.05071: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19110 1726882559.05112: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19110 1726882559.05128: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19110 1726882559.05676: variable 'network_connections' from source: play vars 19110 1726882559.05688: variable 'interface' from source: set_fact 19110 1726882559.05793: variable 'interface' from source: set_fact 19110 1726882559.05806: variable 'interface' from source: set_fact 19110 1726882559.05902: variable 'interface' from source: set_fact 19110 1726882559.05975: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 19110 1726882559.06034: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 19110 1726882559.06984: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 19110 1726882559.07024: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 19110 1726882559.07085: variable '__network_wireless_connections_defined' from source: role '' defaults 19110 1726882559.07391: variable 'network_connections' from source: play vars 19110 1726882559.08076: variable 'interface' from source: set_fact 19110 1726882559.08183: variable 'interface' from source: set_fact 19110 1726882559.08196: variable 'interface' from source: set_fact 19110 1726882559.08295: variable 'interface' from source: set_fact 19110 1726882559.08350: variable '__network_packages_default_wireless' from source: role '' defaults 19110 1726882559.08428: variable '__network_wireless_connections_defined' from source: role '' defaults 19110 1726882559.08751: variable 'network_connections' from source: play vars 19110 1726882559.09176: variable 'interface' from source: set_fact 19110 1726882559.09260: variable 'interface' from source: set_fact 19110 1726882559.09277: variable 'interface' from source: set_fact 19110 1726882559.09369: variable 'interface' from source: set_fact 19110 1726882559.09413: variable '__network_packages_default_team' from source: role '' defaults 19110 1726882559.09513: variable '__network_team_connections_defined' from source: role '' defaults 19110 1726882559.09962: variable 'network_connections' from source: play vars 19110 1726882559.09981: variable 'interface' from source: set_fact 19110 1726882559.10066: variable 'interface' from source: set_fact 19110 1726882559.10084: variable 'interface' from source: set_fact 19110 1726882559.10171: variable 'interface' from source: set_fact 19110 1726882559.10265: variable '__network_service_name_default_initscripts' from source: role '' defaults 19110 
1726882559.10346: variable '__network_service_name_default_initscripts' from source: role '' defaults 19110 1726882559.10370: variable '__network_packages_default_initscripts' from source: role '' defaults 19110 1726882559.10431: variable '__network_packages_default_initscripts' from source: role '' defaults 19110 1726882559.10714: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 19110 1726882559.11308: variable 'network_connections' from source: play vars 19110 1726882559.11318: variable 'interface' from source: set_fact 19110 1726882559.11384: variable 'interface' from source: set_fact 19110 1726882559.11399: variable 'interface' from source: set_fact 19110 1726882559.11490: variable 'interface' from source: set_fact 19110 1726882559.11513: variable 'ansible_distribution' from source: facts 19110 1726882559.11523: variable '__network_rh_distros' from source: role '' defaults 19110 1726882559.11539: variable 'ansible_distribution_major_version' from source: facts 19110 1726882559.11579: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 19110 1726882559.11787: variable 'ansible_distribution' from source: facts 19110 1726882559.11801: variable '__network_rh_distros' from source: role '' defaults 19110 1726882559.11821: variable 'ansible_distribution_major_version' from source: facts 19110 1726882559.11839: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 19110 1726882559.12062: variable 'ansible_distribution' from source: facts 19110 1726882559.12076: variable '__network_rh_distros' from source: role '' defaults 19110 1726882559.12094: variable 'ansible_distribution_major_version' from source: facts 19110 1726882559.12143: variable 'network_provider' from source: set_fact 19110 1726882559.12168: variable 'ansible_facts' from source: unknown 19110 1726882559.13487: Evaluated conditional (not network_packages is 
subset(ansible_facts.packages.keys())): False 19110 1726882559.13509: when evaluation is False, skipping this task 19110 1726882559.13522: _execute() done 19110 1726882559.13541: dumping result to json 19110 1726882559.13577: done dumping result, returning 19110 1726882559.13615: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages [0e448fcc-3ce9-5372-c19a-000000000021] 19110 1726882559.13638: sending task result for task 0e448fcc-3ce9-5372-c19a-000000000021 skipping: [managed_node1] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 19110 1726882559.13889: no more pending results, returning what we have 19110 1726882559.13892: results queue empty 19110 1726882559.13893: checking for any_errors_fatal 19110 1726882559.13900: done checking for any_errors_fatal 19110 1726882559.13907: checking for max_fail_percentage 19110 1726882559.13914: done checking for max_fail_percentage 19110 1726882559.13916: checking to see if all hosts have failed and the running result is not ok 19110 1726882559.13916: done checking to see if all hosts have failed 19110 1726882559.13917: getting the remaining hosts for this loop 19110 1726882559.13919: done getting the remaining hosts for this loop 19110 1726882559.13922: getting the next task for host managed_node1 19110 1726882559.13929: done getting next task for host managed_node1 19110 1726882559.13933: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 19110 1726882559.13935: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19110 1726882559.13995: done sending task result for task 0e448fcc-3ce9-5372-c19a-000000000021 19110 1726882559.13998: WORKER PROCESS EXITING 19110 1726882559.14008: getting variables 19110 1726882559.14010: in VariableManager get_vars() 19110 1726882559.14074: Calling all_inventory to load vars for managed_node1 19110 1726882559.14077: Calling groups_inventory to load vars for managed_node1 19110 1726882559.14079: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882559.14095: Calling all_plugins_play to load vars for managed_node1 19110 1726882559.14099: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882559.14121: Calling groups_plugins_play to load vars for managed_node1 19110 1726882559.18135: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882559.19978: done with get_vars() 19110 1726882559.20008: done getting variables 19110 1726882559.20085: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 21:35:59 -0400 (0:00:00.257) 0:00:16.058 ****** 19110 1726882559.20119: entering _queue_task() for managed_node1/package 19110 1726882559.20451: worker is 1 (out of 1 available) 19110 1726882559.20467: exiting _queue_task() for managed_node1/package 19110 1726882559.20484: done queuing things up, now waiting for results queue to drain 19110 1726882559.20486: waiting for pending results... 
19110 1726882559.20767: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 19110 1726882559.20895: in run() - task 0e448fcc-3ce9-5372-c19a-000000000022 19110 1726882559.20922: variable 'ansible_search_path' from source: unknown 19110 1726882559.20933: variable 'ansible_search_path' from source: unknown 19110 1726882559.20979: calling self._execute() 19110 1726882559.21087: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882559.21098: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882559.21114: variable 'omit' from source: magic vars 19110 1726882559.21600: variable 'ansible_distribution_major_version' from source: facts 19110 1726882559.21621: Evaluated conditional (ansible_distribution_major_version != '6'): True 19110 1726882559.21781: variable 'network_state' from source: role '' defaults 19110 1726882559.21816: Evaluated conditional (network_state != {}): False 19110 1726882559.21831: when evaluation is False, skipping this task 19110 1726882559.21841: _execute() done 19110 1726882559.21848: dumping result to json 19110 1726882559.21866: done dumping result, returning 19110 1726882559.21893: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0e448fcc-3ce9-5372-c19a-000000000022] 19110 1726882559.21920: sending task result for task 0e448fcc-3ce9-5372-c19a-000000000022 skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 19110 1726882559.22099: no more pending results, returning what we have 19110 1726882559.22104: results queue empty 19110 1726882559.22105: checking for any_errors_fatal 19110 1726882559.22122: done checking for any_errors_fatal 19110 1726882559.22123: checking for max_fail_percentage 19110 
1726882559.22126: done checking for max_fail_percentage 19110 1726882559.22127: checking to see if all hosts have failed and the running result is not ok 19110 1726882559.22127: done checking to see if all hosts have failed 19110 1726882559.22128: getting the remaining hosts for this loop 19110 1726882559.22130: done getting the remaining hosts for this loop 19110 1726882559.22134: getting the next task for host managed_node1 19110 1726882559.22140: done getting next task for host managed_node1 19110 1726882559.22148: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 19110 1726882559.22151: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19110 1726882559.22173: getting variables 19110 1726882559.22175: in VariableManager get_vars() 19110 1726882559.22226: Calling all_inventory to load vars for managed_node1 19110 1726882559.22231: Calling groups_inventory to load vars for managed_node1 19110 1726882559.22237: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882559.22253: Calling all_plugins_play to load vars for managed_node1 19110 1726882559.22261: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882559.22267: Calling groups_plugins_play to load vars for managed_node1 19110 1726882559.23231: done sending task result for task 0e448fcc-3ce9-5372-c19a-000000000022 19110 1726882559.23234: WORKER PROCESS EXITING 19110 1726882559.25193: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882559.27106: done with get_vars() 19110 1726882559.27127: done getting variables 19110 1726882559.27303: Loading ActionModule 'package' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 21:35:59 -0400 (0:00:00.072) 0:00:16.130 ****** 19110 1726882559.27330: entering _queue_task() for managed_node1/package 19110 1726882559.27823: worker is 1 (out of 1 available) 19110 1726882559.27950: exiting _queue_task() for managed_node1/package 19110 1726882559.27967: done queuing things up, now waiting for results queue to drain 19110 1726882559.27969: waiting for pending results... 19110 1726882559.28560: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 19110 1726882559.28704: in run() - task 0e448fcc-3ce9-5372-c19a-000000000023 19110 1726882559.28740: variable 'ansible_search_path' from source: unknown 19110 1726882559.28761: variable 'ansible_search_path' from source: unknown 19110 1726882559.28846: calling self._execute() 19110 1726882559.28977: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882559.28989: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882559.29003: variable 'omit' from source: magic vars 19110 1726882559.29429: variable 'ansible_distribution_major_version' from source: facts 19110 1726882559.29453: Evaluated conditional (ansible_distribution_major_version != '6'): True 19110 1726882559.29596: variable 'network_state' from source: role '' defaults 19110 1726882559.29610: Evaluated conditional (network_state != {}): False 19110 1726882559.29622: when evaluation is False, 
skipping this task 19110 1726882559.29630: _execute() done 19110 1726882559.29638: dumping result to json 19110 1726882559.29646: done dumping result, returning 19110 1726882559.29663: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0e448fcc-3ce9-5372-c19a-000000000023] 19110 1726882559.29678: sending task result for task 0e448fcc-3ce9-5372-c19a-000000000023 skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 19110 1726882559.29880: no more pending results, returning what we have 19110 1726882559.29889: results queue empty 19110 1726882559.29891: checking for any_errors_fatal 19110 1726882559.29899: done checking for any_errors_fatal 19110 1726882559.29900: checking for max_fail_percentage 19110 1726882559.29903: done checking for max_fail_percentage 19110 1726882559.29904: checking to see if all hosts have failed and the running result is not ok 19110 1726882559.29905: done checking to see if all hosts have failed 19110 1726882559.29905: getting the remaining hosts for this loop 19110 1726882559.29907: done getting the remaining hosts for this loop 19110 1726882559.29912: getting the next task for host managed_node1 19110 1726882559.29917: done getting next task for host managed_node1 19110 1726882559.29922: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 19110 1726882559.29924: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19110 1726882559.29937: getting variables 19110 1726882559.29939: in VariableManager get_vars() 19110 1726882559.29983: Calling all_inventory to load vars for managed_node1 19110 1726882559.29986: Calling groups_inventory to load vars for managed_node1 19110 1726882559.29989: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882559.30000: Calling all_plugins_play to load vars for managed_node1 19110 1726882559.30003: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882559.30007: Calling groups_plugins_play to load vars for managed_node1 19110 1726882559.31662: done sending task result for task 0e448fcc-3ce9-5372-c19a-000000000023 19110 1726882559.31666: WORKER PROCESS EXITING 19110 1726882559.33299: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882559.36846: done with get_vars() 19110 1726882559.36894: done getting variables 19110 1726882559.37097: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 21:35:59 -0400 (0:00:00.098) 0:00:16.228 ****** 19110 1726882559.37137: entering _queue_task() for managed_node1/service 19110 1726882559.37140: Creating lock for service 19110 1726882559.37475: worker is 1 (out of 1 available) 19110 1726882559.37487: exiting _queue_task() for managed_node1/service 19110 1726882559.37498: done queuing things up, now waiting for results queue to drain 19110 1726882559.37499: waiting for pending results... 
19110 1726882559.37911: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 19110 1726882559.38081: in run() - task 0e448fcc-3ce9-5372-c19a-000000000024 19110 1726882559.38112: variable 'ansible_search_path' from source: unknown 19110 1726882559.38135: variable 'ansible_search_path' from source: unknown 19110 1726882559.38215: calling self._execute() 19110 1726882559.39073: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882559.39091: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882559.39152: variable 'omit' from source: magic vars 19110 1726882559.40242: variable 'ansible_distribution_major_version' from source: facts 19110 1726882559.40262: Evaluated conditional (ansible_distribution_major_version != '6'): True 19110 1726882559.40494: variable '__network_wireless_connections_defined' from source: role '' defaults 19110 1726882559.41184: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 19110 1726882559.45737: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 19110 1726882559.45844: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 19110 1726882559.45897: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 19110 1726882559.45934: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 19110 1726882559.45970: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 19110 1726882559.46061: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 19110 1726882559.46095: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19110 1726882559.46127: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19110 1726882559.46169: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19110 1726882559.46185: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19110 1726882559.46236: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19110 1726882559.46267: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19110 1726882559.46298: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19110 1726882559.46347: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19110 1726882559.47018: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19110 1726882559.47069: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19110 1726882559.47248: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19110 1726882559.47284: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19110 1726882559.47322: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19110 1726882559.47343: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19110 1726882559.47648: variable 'network_connections' from source: play vars 19110 1726882559.47722: variable 'interface' from source: set_fact 19110 1726882559.47816: variable 'interface' from source: set_fact 19110 1726882559.47834: variable 'interface' from source: set_fact 19110 1726882559.47913: variable 'interface' from source: set_fact 19110 1726882559.48002: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 19110 1726882559.48213: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 19110 1726882559.48266: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 19110 1726882559.48303: Loading 
TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 19110 1726882559.48340: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 19110 1726882559.48396: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 19110 1726882559.48428: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 19110 1726882559.48461: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 19110 1726882559.48499: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 19110 1726882559.48572: variable '__network_team_connections_defined' from source: role '' defaults 19110 1726882559.48844: variable 'network_connections' from source: play vars 19110 1726882559.48861: variable 'interface' from source: set_fact 19110 1726882559.48930: variable 'interface' from source: set_fact 19110 1726882559.48942: variable 'interface' from source: set_fact 19110 1726882559.49010: variable 'interface' from source: set_fact 19110 1726882559.49053: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 19110 1726882559.49067: when evaluation is False, skipping this task 19110 1726882559.49077: _execute() done 19110 1726882559.49086: dumping result to json 19110 1726882559.49094: done dumping result, returning 19110 1726882559.49106: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart 
NetworkManager due to wireless or team interfaces [0e448fcc-3ce9-5372-c19a-000000000024] 19110 1726882559.49125: sending task result for task 0e448fcc-3ce9-5372-c19a-000000000024 skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 19110 1726882559.49288: no more pending results, returning what we have 19110 1726882559.49293: results queue empty 19110 1726882559.49294: checking for any_errors_fatal 19110 1726882559.49305: done checking for any_errors_fatal 19110 1726882559.49306: checking for max_fail_percentage 19110 1726882559.49307: done checking for max_fail_percentage 19110 1726882559.49308: checking to see if all hosts have failed and the running result is not ok 19110 1726882559.49309: done checking to see if all hosts have failed 19110 1726882559.49310: getting the remaining hosts for this loop 19110 1726882559.49312: done getting the remaining hosts for this loop 19110 1726882559.49315: getting the next task for host managed_node1 19110 1726882559.49322: done getting next task for host managed_node1 19110 1726882559.49326: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 19110 1726882559.49328: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19110 1726882559.49342: getting variables 19110 1726882559.49344: in VariableManager get_vars() 19110 1726882559.49390: Calling all_inventory to load vars for managed_node1 19110 1726882559.49394: Calling groups_inventory to load vars for managed_node1 19110 1726882559.49396: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882559.49408: Calling all_plugins_play to load vars for managed_node1 19110 1726882559.49411: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882559.49415: Calling groups_plugins_play to load vars for managed_node1 19110 1726882559.50415: done sending task result for task 0e448fcc-3ce9-5372-c19a-000000000024 19110 1726882559.50419: WORKER PROCESS EXITING 19110 1726882559.51317: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882559.54082: done with get_vars() 19110 1726882559.54108: done getting variables 19110 1726882559.54173: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 21:35:59 -0400 (0:00:00.170) 0:00:16.399 ****** 19110 1726882559.54204: entering _queue_task() for managed_node1/service 19110 1726882559.54519: worker is 1 (out of 1 available) 19110 1726882559.54531: exiting _queue_task() for managed_node1/service 19110 1726882559.54543: done queuing things up, now waiting for results queue to drain 19110 1726882559.54544: waiting for pending results... 
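The next task queued here (task path `roles/network/tasks/main.yml:122`) does run: the trace shows both `ansible_distribution_major_version != '6'` and `network_provider == "nm" or network_state != {}` evaluating True, then the `network_service_name` role default being resolved. A minimal sketch of what that task might look like — the condition and variable names come from the trace, while the exact `service` arguments are assumptions:

```yaml
# Hypothetical sketch of the enable-and-start task; only the task
# name, conditions, and network_service_name appear in the trace.
- name: Enable and start NetworkManager
  service:
    name: "{{ network_service_name }}"
    state: started
    enabled: true
  when: network_provider == "nm" or network_state != {}
```

Because this task actually executes, the trace that follows shows the full remote-execution pipeline: connection vars are set, `_low_level_execute_command()` probes the remote home directory and creates a temp directory, and the `systemd` module is packaged by ANSIBALLZ and transferred over SFTP as `AnsiballZ_systemd.py`.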
19110 1726882559.54827: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 19110 1726882559.54946: in run() - task 0e448fcc-3ce9-5372-c19a-000000000025 19110 1726882559.54973: variable 'ansible_search_path' from source: unknown 19110 1726882559.54981: variable 'ansible_search_path' from source: unknown 19110 1726882559.55029: calling self._execute() 19110 1726882559.55131: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882559.55143: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882559.55160: variable 'omit' from source: magic vars 19110 1726882559.55554: variable 'ansible_distribution_major_version' from source: facts 19110 1726882559.55579: Evaluated conditional (ansible_distribution_major_version != '6'): True 19110 1726882559.55752: variable 'network_provider' from source: set_fact 19110 1726882559.55767: variable 'network_state' from source: role '' defaults 19110 1726882559.55785: Evaluated conditional (network_provider == "nm" or network_state != {}): True 19110 1726882559.55795: variable 'omit' from source: magic vars 19110 1726882559.55833: variable 'omit' from source: magic vars 19110 1726882559.55896: variable 'network_service_name' from source: role '' defaults 19110 1726882559.55974: variable 'network_service_name' from source: role '' defaults 19110 1726882559.56170: variable '__network_provider_setup' from source: role '' defaults 19110 1726882559.56181: variable '__network_service_name_default_nm' from source: role '' defaults 19110 1726882559.56257: variable '__network_service_name_default_nm' from source: role '' defaults 19110 1726882559.56319: variable '__network_packages_default_nm' from source: role '' defaults 19110 1726882559.56489: variable '__network_packages_default_nm' from source: role '' defaults 19110 1726882559.56977: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due 
to reserved name 19110 1726882559.63571: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 19110 1726882559.63748: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 19110 1726882559.63919: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 19110 1726882559.63963: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 19110 1726882559.64104: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 19110 1726882559.64195: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19110 1726882559.64340: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19110 1726882559.64379: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19110 1726882559.64444: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19110 1726882559.64554: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19110 1726882559.64609: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 19110 1726882559.64752: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19110 1726882559.64792: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19110 1726882559.64838: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19110 1726882559.64878: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19110 1726882559.65353: variable '__network_packages_default_gobject_packages' from source: role '' defaults 19110 1726882559.65647: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19110 1726882559.65749: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19110 1726882559.65780: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19110 1726882559.65873: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19110 1726882559.65953: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19110 1726882559.66164: variable 'ansible_python' from source: facts 19110 1726882559.66191: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 19110 1726882559.66393: variable '__network_wpa_supplicant_required' from source: role '' defaults 19110 1726882559.66595: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 19110 1726882559.66842: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19110 1726882559.66877: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19110 1726882559.66908: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19110 1726882559.67074: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19110 1726882559.67094: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19110 1726882559.67166: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19110 1726882559.67272: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19110 1726882559.67301: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19110 1726882559.67396: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19110 1726882559.67480: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19110 1726882559.67843: variable 'network_connections' from source: play vars 19110 1726882559.67860: variable 'interface' from source: set_fact 19110 1726882559.67958: variable 'interface' from source: set_fact 19110 1726882559.68071: variable 'interface' from source: set_fact 19110 1726882559.68247: variable 'interface' from source: set_fact 19110 1726882559.68484: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 19110 1726882559.69710: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 19110 1726882559.69860: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 19110 1726882559.69960: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 19110 1726882559.70177: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 19110 1726882559.70304: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 19110 1726882559.70531: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 19110 1726882559.70577: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 19110 1726882559.70615: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 19110 1726882559.70717: variable '__network_wireless_connections_defined' from source: role '' defaults 19110 1726882559.71307: variable 'network_connections' from source: play vars 19110 1726882559.71436: variable 'interface' from source: set_fact 19110 1726882559.71512: variable 'interface' from source: set_fact 19110 1726882559.71526: variable 'interface' from source: set_fact 19110 1726882559.71718: variable 'interface' from source: set_fact 19110 1726882559.71775: variable '__network_packages_default_wireless' from source: role '' defaults 19110 1726882559.71940: variable '__network_wireless_connections_defined' from source: role '' defaults 19110 1726882559.72487: variable 'network_connections' from source: play vars 19110 1726882559.72627: variable 'interface' from source: set_fact 19110 1726882559.72704: variable 'interface' from source: set_fact 19110 1726882559.72844: variable 'interface' from source: set_fact 19110 1726882559.72921: variable 'interface' from source: set_fact 19110 1726882559.73068: variable '__network_packages_default_team' from source: role '' defaults 19110 1726882559.73148: variable '__network_team_connections_defined' from source: role '' defaults 19110 1726882559.73684: variable 
'network_connections' from source: play vars 19110 1726882559.73821: variable 'interface' from source: set_fact 19110 1726882559.73949: variable 'interface' from source: set_fact 19110 1726882559.73962: variable 'interface' from source: set_fact 19110 1726882559.74101: variable 'interface' from source: set_fact 19110 1726882559.74203: variable '__network_service_name_default_initscripts' from source: role '' defaults 19110 1726882559.74411: variable '__network_service_name_default_initscripts' from source: role '' defaults 19110 1726882559.74423: variable '__network_packages_default_initscripts' from source: role '' defaults 19110 1726882559.74498: variable '__network_packages_default_initscripts' from source: role '' defaults 19110 1726882559.74730: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 19110 1726882559.75268: variable 'network_connections' from source: play vars 19110 1726882559.75278: variable 'interface' from source: set_fact 19110 1726882559.75346: variable 'interface' from source: set_fact 19110 1726882559.75361: variable 'interface' from source: set_fact 19110 1726882559.75425: variable 'interface' from source: set_fact 19110 1726882559.75451: variable 'ansible_distribution' from source: facts 19110 1726882559.75465: variable '__network_rh_distros' from source: role '' defaults 19110 1726882559.75476: variable 'ansible_distribution_major_version' from source: facts 19110 1726882559.75502: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 19110 1726882559.75710: variable 'ansible_distribution' from source: facts 19110 1726882559.75719: variable '__network_rh_distros' from source: role '' defaults 19110 1726882559.75729: variable 'ansible_distribution_major_version' from source: facts 19110 1726882559.75745: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 19110 1726882559.75937: variable 'ansible_distribution' from source: 
facts 19110 1726882559.75946: variable '__network_rh_distros' from source: role '' defaults 19110 1726882559.75959: variable 'ansible_distribution_major_version' from source: facts 19110 1726882559.76009: variable 'network_provider' from source: set_fact 19110 1726882559.76038: variable 'omit' from source: magic vars 19110 1726882559.76074: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19110 1726882559.76114: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19110 1726882559.76138: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19110 1726882559.76166: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19110 1726882559.76183: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19110 1726882559.76224: variable 'inventory_hostname' from source: host vars for 'managed_node1' 19110 1726882559.76234: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882559.76241: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882559.76359: Set connection var ansible_timeout to 10 19110 1726882559.76382: Set connection var ansible_module_compression to ZIP_DEFLATED 19110 1726882559.76392: Set connection var ansible_shell_executable to /bin/sh 19110 1726882559.76399: Set connection var ansible_shell_type to sh 19110 1726882559.76406: Set connection var ansible_connection to ssh 19110 1726882559.76416: Set connection var ansible_pipelining to False 19110 1726882559.76452: variable 'ansible_shell_executable' from source: unknown 19110 1726882559.76466: variable 'ansible_connection' from source: unknown 19110 1726882559.76475: variable 'ansible_module_compression' from source: unknown 19110 1726882559.76481: 
variable 'ansible_shell_type' from source: unknown 19110 1726882559.76487: variable 'ansible_shell_executable' from source: unknown 19110 1726882559.76494: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882559.76504: variable 'ansible_pipelining' from source: unknown 19110 1726882559.76510: variable 'ansible_timeout' from source: unknown 19110 1726882559.76517: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882559.76639: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 19110 1726882559.76662: variable 'omit' from source: magic vars 19110 1726882559.76675: starting attempt loop 19110 1726882559.76682: running the handler 19110 1726882559.77432: variable 'ansible_facts' from source: unknown 19110 1726882559.78541: _low_level_execute_command(): starting 19110 1726882559.78553: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 19110 1726882559.79308: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882559.79322: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882559.79340: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882559.79370: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882559.79414: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882559.79427: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882559.79441: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882559.79472: 
stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882559.79485: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882559.79497: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882559.79509: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882559.79523: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882559.79539: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882559.79553: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882559.79575: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882559.79591: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882559.79669: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882559.79698: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882559.79715: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882559.79853: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882559.81539: stdout chunk (state=3): >>>/root <<< 19110 1726882559.81679: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882559.81739: stderr chunk (state=3): >>><<< 19110 1726882559.81743: stdout chunk (state=3): >>><<< 19110 1726882559.81853: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 1726882559.81860: _low_level_execute_command(): starting 19110 1726882559.81863: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882559.8176663-19873-33796278114273 `" && echo ansible-tmp-1726882559.8176663-19873-33796278114273="` echo /root/.ansible/tmp/ansible-tmp-1726882559.8176663-19873-33796278114273 `" ) && sleep 0' 19110 1726882559.83282: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 19110 1726882559.83380: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882559.83447: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882559.83523: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882559.83692: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882559.85569: stdout chunk (state=3): >>>ansible-tmp-1726882559.8176663-19873-33796278114273=/root/.ansible/tmp/ansible-tmp-1726882559.8176663-19873-33796278114273 <<< 19110 1726882559.85681: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882559.85786: stderr chunk (state=3): >>><<< 19110 1726882559.85789: stdout chunk (state=3): >>><<< 19110 1726882559.85869: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882559.8176663-19873-33796278114273=/root/.ansible/tmp/ansible-tmp-1726882559.8176663-19873-33796278114273 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 1726882559.85878: variable 'ansible_module_compression' from source: unknown 19110 1726882559.86173: ANSIBALLZ: Using generic lock for ansible.legacy.systemd 19110 1726882559.86177: ANSIBALLZ: Acquiring lock 19110 1726882559.86179: ANSIBALLZ: Lock acquired: 139855634067296 19110 1726882559.86181: ANSIBALLZ: Creating module 19110 1726882560.29291: ANSIBALLZ: Writing module into payload 19110 1726882560.29509: ANSIBALLZ: Writing module 19110 1726882560.29547: ANSIBALLZ: Renaming module 19110 1726882560.29566: ANSIBALLZ: Done creating module 19110 1726882560.29608: variable 'ansible_facts' from source: unknown 19110 1726882560.29801: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882559.8176663-19873-33796278114273/AnsiballZ_systemd.py 19110 1726882560.29969: Sending initial data 19110 1726882560.29972: Sent initial data (155 bytes) 19110 1726882560.32185: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882560.32189: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882560.32220: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found <<< 19110 1726882560.32223: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882560.32226: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882560.32419: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882560.32499: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882560.32693: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882560.34454: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 19110 1726882560.34543: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 19110 1726882560.34641: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-191108pnkimox/tmpyxb1bnxr /root/.ansible/tmp/ansible-tmp-1726882559.8176663-19873-33796278114273/AnsiballZ_systemd.py <<< 19110 1726882560.34733: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 19110 1726882560.38210: stderr chunk (state=3): 
>>>debug2: Received exit status from master 0 <<< 19110 1726882560.38217: stderr chunk (state=3): >>><<< 19110 1726882560.38220: stdout chunk (state=3): >>><<< 19110 1726882560.38245: done transferring module to remote 19110 1726882560.38260: _low_level_execute_command(): starting 19110 1726882560.38266: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882559.8176663-19873-33796278114273/ /root/.ansible/tmp/ansible-tmp-1726882559.8176663-19873-33796278114273/AnsiballZ_systemd.py && sleep 0' 19110 1726882560.39517: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882560.39521: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882560.39710: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882560.39714: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882560.39729: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882560.39734: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882560.40515: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882560.40519: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882560.40533: stderr 
chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882560.40657: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882560.42538: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882560.42541: stderr chunk (state=3): >>><<< 19110 1726882560.42546: stdout chunk (state=3): >>><<< 19110 1726882560.42562: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 1726882560.42567: _low_level_execute_command(): starting 19110 1726882560.42573: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882559.8176663-19873-33796278114273/AnsiballZ_systemd.py && sleep 0' 19110 1726882560.43217: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config <<< 19110 1726882560.43223: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882560.43262: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found <<< 19110 1726882560.43270: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882560.43289: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration <<< 19110 1726882560.43294: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882560.43302: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882560.43307: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882560.43387: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882560.43396: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882560.43412: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882560.43543: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882560.68632: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": 
"terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "618", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:24:29 EDT", "ExecMainStartTimestampMonotonic": "27221076", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "618", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.<<< 19110 1726882560.68678: stdout chunk (state=3): >>>service", "ControlGroupId": "2455", "MemoryCurrent": "16138240", "MemoryAvailable": "infinity", "CPUUsageNSec": "913895000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", 
"IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": 
"0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSi<<< 19110 1726882560.68685: stdout chunk (state=3): >>>gnal": "6", "Id": "NetworkManager.service", "Names": 
"NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target NetworkManager-wait-online.service network.target multi-user.target network.service cloud-init.service", "After": "network-pre.target system.slice cloud-init-local.service sysinit.target systemd-journald.socket basic.target dbus-broker.service dbus.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:32:48 EDT", "StateChangeTimestampMonotonic": "526071006", "InactiveExitTimestamp": "Fri 2024-09-20 21:24:29 EDT", "InactiveExitTimestampMonotonic": "27221264", "ActiveEnterTimestamp": "Fri 2024-09-20 21:24:30 EDT", "ActiveEnterTimestampMonotonic": "28518220", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:24:29 EDT", "ConditionTimestampMonotonic": "27216465", "AssertTimestamp": "Fri 2024-09-20 21:24:29 EDT", "AssertTimestampMonotonic": "27216468", "Transient": "no", "Perpetual": 
"no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "ac59cab3098f415297681de935e089f5", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 19110 1726882560.70172: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. <<< 19110 1726882560.70288: stderr chunk (state=3): >>><<< 19110 1726882560.70291: stdout chunk (state=3): >>><<< 19110 1726882560.70378: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "618", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:24:29 EDT", "ExecMainStartTimestampMonotonic": "27221076", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "618", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; 
start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2455", "MemoryCurrent": "16138240", "MemoryAvailable": "infinity", "CPUUsageNSec": "913895000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": 
"no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", 
"ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target NetworkManager-wait-online.service network.target multi-user.target network.service cloud-init.service", "After": "network-pre.target system.slice cloud-init-local.service sysinit.target systemd-journald.socket basic.target dbus-broker.service dbus.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:32:48 EDT", 
"StateChangeTimestampMonotonic": "526071006", "InactiveExitTimestamp": "Fri 2024-09-20 21:24:29 EDT", "InactiveExitTimestampMonotonic": "27221264", "ActiveEnterTimestamp": "Fri 2024-09-20 21:24:30 EDT", "ActiveEnterTimestampMonotonic": "28518220", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:24:29 EDT", "ConditionTimestampMonotonic": "27216465", "AssertTimestamp": "Fri 2024-09-20 21:24:29 EDT", "AssertTimestampMonotonic": "27216468", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "ac59cab3098f415297681de935e089f5", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 19110 1726882560.70535: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882559.8176663-19873-33796278114273/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 19110 1726882560.70566: _low_level_execute_command(): starting 19110 1726882560.70577: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882559.8176663-19873-33796278114273/ > /dev/null 2>&1 && sleep 0' 19110 1726882560.71319: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882560.71336: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882560.71368: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882560.71396: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882560.71453: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882560.71471: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882560.71493: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882560.71511: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882560.71523: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882560.71539: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882560.71556: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882560.71587: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882560.71606: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882560.71619: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882560.71631: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882560.71657: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882560.71746: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882560.71777: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882560.71792: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882560.71919: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882560.73783: stderr chunk (state=3): 
>>>debug2: Received exit status from master 0 <<< 19110 1726882560.73786: stdout chunk (state=3): >>><<< 19110 1726882560.73793: stderr chunk (state=3): >>><<< 19110 1726882560.73824: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 1726882560.73831: handler run complete 19110 1726882560.73892: attempt loop complete, returning result 19110 1726882560.73897: _execute() done 19110 1726882560.73899: dumping result to json 19110 1726882560.73914: done dumping result, returning 19110 1726882560.73924: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0e448fcc-3ce9-5372-c19a-000000000025] 19110 1726882560.73929: sending task result for task 0e448fcc-3ce9-5372-c19a-000000000025 19110 1726882560.74212: done sending task result for task 0e448fcc-3ce9-5372-c19a-000000000025 19110 
1726882560.74215: WORKER PROCESS EXITING ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 19110 1726882560.74269: no more pending results, returning what we have 19110 1726882560.74272: results queue empty 19110 1726882560.74273: checking for any_errors_fatal 19110 1726882560.74281: done checking for any_errors_fatal 19110 1726882560.74282: checking for max_fail_percentage 19110 1726882560.74284: done checking for max_fail_percentage 19110 1726882560.74284: checking to see if all hosts have failed and the running result is not ok 19110 1726882560.74285: done checking to see if all hosts have failed 19110 1726882560.74286: getting the remaining hosts for this loop 19110 1726882560.74287: done getting the remaining hosts for this loop 19110 1726882560.74290: getting the next task for host managed_node1 19110 1726882560.74295: done getting next task for host managed_node1 19110 1726882560.74300: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 19110 1726882560.74301: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19110 1726882560.74310: getting variables 19110 1726882560.74312: in VariableManager get_vars() 19110 1726882560.74346: Calling all_inventory to load vars for managed_node1 19110 1726882560.74349: Calling groups_inventory to load vars for managed_node1 19110 1726882560.74351: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882560.74362: Calling all_plugins_play to load vars for managed_node1 19110 1726882560.74372: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882560.74375: Calling groups_plugins_play to load vars for managed_node1 19110 1726882560.76288: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882560.78466: done with get_vars() 19110 1726882560.78490: done getting variables 19110 1726882560.78605: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 21:36:00 -0400 (0:00:01.244) 0:00:17.643 ****** 19110 1726882560.78669: entering _queue_task() for managed_node1/service 19110 1726882560.79437: worker is 1 (out of 1 available) 19110 1726882560.79449: exiting _queue_task() for managed_node1/service 19110 1726882560.79495: done queuing things up, now waiting for results queue to drain 19110 1726882560.79497: waiting for pending results... 
19110 1726882560.79902: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 19110 1726882560.80049: in run() - task 0e448fcc-3ce9-5372-c19a-000000000026 19110 1726882560.80063: variable 'ansible_search_path' from source: unknown 19110 1726882560.80073: variable 'ansible_search_path' from source: unknown 19110 1726882560.80174: calling self._execute() 19110 1726882560.80265: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882560.80269: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882560.80280: variable 'omit' from source: magic vars 19110 1726882560.80686: variable 'ansible_distribution_major_version' from source: facts 19110 1726882560.80707: Evaluated conditional (ansible_distribution_major_version != '6'): True 19110 1726882560.80843: variable 'network_provider' from source: set_fact 19110 1726882560.80866: Evaluated conditional (network_provider == "nm"): True 19110 1726882560.80987: variable '__network_wpa_supplicant_required' from source: role '' defaults 19110 1726882560.81075: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 19110 1726882560.81244: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 19110 1726882560.83944: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 19110 1726882560.84054: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 19110 1726882560.84088: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 19110 1726882560.84204: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 19110 1726882560.84254: Loading FilterModule 'urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 19110 1726882560.84490: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19110 1726882560.84521: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19110 1726882560.84545: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19110 1726882560.84587: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19110 1726882560.84606: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19110 1726882560.84648: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19110 1726882560.84674: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19110 1726882560.84704: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19110 1726882560.84885: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19110 1726882560.84904: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19110 1726882560.85126: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19110 1726882560.85162: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19110 1726882560.85246: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19110 1726882560.85298: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19110 1726882560.85313: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19110 1726882560.85581: variable 'network_connections' from source: play vars 19110 1726882560.85593: variable 'interface' from source: set_fact 19110 1726882560.85668: variable 'interface' from source: set_fact 19110 1726882560.85677: variable 'interface' from source: set_fact 19110 1726882560.85795: variable 'interface' from source: set_fact 19110 1726882560.86071: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 19110 1726882560.86190: Loading TestModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 19110 1726882560.86194: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 19110 1726882560.86317: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 19110 1726882560.86435: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 19110 1726882560.86483: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 19110 1726882560.86523: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 19110 1726882560.86546: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 19110 1726882560.86573: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 19110 1726882560.86658: variable '__network_wireless_connections_defined' from source: role '' defaults 19110 1726882560.87063: variable 'network_connections' from source: play vars 19110 1726882560.87071: variable 'interface' from source: set_fact 19110 1726882560.87127: variable 'interface' from source: set_fact 19110 1726882560.87133: variable 'interface' from source: set_fact 19110 1726882560.87262: variable 'interface' from source: set_fact 19110 1726882560.87318: Evaluated conditional (__network_wpa_supplicant_required): False 19110 1726882560.87322: when evaluation is False, skipping this task 19110 1726882560.87324: _execute() done 19110 1726882560.87333: dumping result 
to json 19110 1726882560.87336: done dumping result, returning 19110 1726882560.87338: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0e448fcc-3ce9-5372-c19a-000000000026] 19110 1726882560.87344: sending task result for task 0e448fcc-3ce9-5372-c19a-000000000026 19110 1726882560.87450: done sending task result for task 0e448fcc-3ce9-5372-c19a-000000000026 19110 1726882560.87454: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 19110 1726882560.87504: no more pending results, returning what we have 19110 1726882560.87508: results queue empty 19110 1726882560.87509: checking for any_errors_fatal 19110 1726882560.87530: done checking for any_errors_fatal 19110 1726882560.87531: checking for max_fail_percentage 19110 1726882560.87533: done checking for max_fail_percentage 19110 1726882560.87534: checking to see if all hosts have failed and the running result is not ok 19110 1726882560.87534: done checking to see if all hosts have failed 19110 1726882560.87535: getting the remaining hosts for this loop 19110 1726882560.87537: done getting the remaining hosts for this loop 19110 1726882560.87540: getting the next task for host managed_node1 19110 1726882560.87546: done getting next task for host managed_node1 19110 1726882560.87550: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 19110 1726882560.87552: ^ state is: HOST STATE: block=2, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19110 1726882560.87567: getting variables 19110 1726882560.87587: in VariableManager get_vars() 19110 1726882560.87632: Calling all_inventory to load vars for managed_node1 19110 1726882560.87635: Calling groups_inventory to load vars for managed_node1 19110 1726882560.87638: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882560.87650: Calling all_plugins_play to load vars for managed_node1 19110 1726882560.87653: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882560.87656: Calling groups_plugins_play to load vars for managed_node1 19110 1726882560.89982: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882560.91857: done with get_vars() 19110 1726882560.91884: done getting variables 19110 1726882560.91943: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 21:36:00 -0400 (0:00:00.133) 0:00:17.776 ****** 19110 1726882560.91974: entering _queue_task() for managed_node1/service 19110 1726882560.92355: worker is 1 (out of 1 available) 19110 1726882560.92406: exiting _queue_task() for managed_node1/service 19110 1726882560.92419: done queuing things up, now waiting for results queue to drain 19110 1726882560.92421: waiting for pending results... 
19110 1726882560.92777: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service 19110 1726882560.92926: in run() - task 0e448fcc-3ce9-5372-c19a-000000000027 19110 1726882560.92938: variable 'ansible_search_path' from source: unknown 19110 1726882560.92941: variable 'ansible_search_path' from source: unknown 19110 1726882560.92981: calling self._execute() 19110 1726882560.93469: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882560.93473: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882560.93475: variable 'omit' from source: magic vars 19110 1726882560.94036: variable 'ansible_distribution_major_version' from source: facts 19110 1726882560.94047: Evaluated conditional (ansible_distribution_major_version != '6'): True 19110 1726882560.94150: variable 'network_provider' from source: set_fact 19110 1726882560.94158: Evaluated conditional (network_provider == "initscripts"): False 19110 1726882560.94161: when evaluation is False, skipping this task 19110 1726882560.94166: _execute() done 19110 1726882560.94169: dumping result to json 19110 1726882560.94171: done dumping result, returning 19110 1726882560.94176: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service [0e448fcc-3ce9-5372-c19a-000000000027] 19110 1726882560.94183: sending task result for task 0e448fcc-3ce9-5372-c19a-000000000027 19110 1726882560.94279: done sending task result for task 0e448fcc-3ce9-5372-c19a-000000000027 19110 1726882560.94282: WORKER PROCESS EXITING skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 19110 1726882560.94338: no more pending results, returning what we have 19110 1726882560.94342: results queue empty 19110 1726882560.94344: checking for any_errors_fatal 19110 1726882560.94354: done checking for 
any_errors_fatal 19110 1726882560.94355: checking for max_fail_percentage 19110 1726882560.94356: done checking for max_fail_percentage 19110 1726882560.94358: checking to see if all hosts have failed and the running result is not ok 19110 1726882560.94358: done checking to see if all hosts have failed 19110 1726882560.94359: getting the remaining hosts for this loop 19110 1726882560.94361: done getting the remaining hosts for this loop 19110 1726882560.94366: getting the next task for host managed_node1 19110 1726882560.94373: done getting next task for host managed_node1 19110 1726882560.94377: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 19110 1726882560.94380: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19110 1726882560.94395: getting variables 19110 1726882560.94397: in VariableManager get_vars() 19110 1726882560.94431: Calling all_inventory to load vars for managed_node1 19110 1726882560.94434: Calling groups_inventory to load vars for managed_node1 19110 1726882560.94436: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882560.94447: Calling all_plugins_play to load vars for managed_node1 19110 1726882560.94449: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882560.94452: Calling groups_plugins_play to load vars for managed_node1 19110 1726882560.97485: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882561.02183: done with get_vars() 19110 1726882561.02333: done getting variables 19110 1726882561.02419: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 21:36:01 -0400 (0:00:00.105) 0:00:17.882 ****** 19110 1726882561.02561: entering _queue_task() for managed_node1/copy 19110 1726882561.03550: worker is 1 (out of 1 available) 19110 1726882561.03562: exiting _queue_task() for managed_node1/copy 19110 1726882561.03577: done queuing things up, now waiting for results queue to drain 19110 1726882561.03578: waiting for pending results... 
19110 1726882561.04083: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 19110 1726882561.04088: in run() - task 0e448fcc-3ce9-5372-c19a-000000000028 19110 1726882561.04092: variable 'ansible_search_path' from source: unknown 19110 1726882561.04095: variable 'ansible_search_path' from source: unknown 19110 1726882561.04098: calling self._execute() 19110 1726882561.04101: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882561.04103: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882561.04109: variable 'omit' from source: magic vars 19110 1726882561.04902: variable 'ansible_distribution_major_version' from source: facts 19110 1726882561.04974: Evaluated conditional (ansible_distribution_major_version != '6'): True 19110 1726882561.05192: variable 'network_provider' from source: set_fact 19110 1726882561.05198: Evaluated conditional (network_provider == "initscripts"): False 19110 1726882561.05200: when evaluation is False, skipping this task 19110 1726882561.05206: _execute() done 19110 1726882561.05209: dumping result to json 19110 1726882561.05229: done dumping result, returning 19110 1726882561.05240: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0e448fcc-3ce9-5372-c19a-000000000028] 19110 1726882561.05250: sending task result for task 0e448fcc-3ce9-5372-c19a-000000000028 19110 1726882561.05433: done sending task result for task 0e448fcc-3ce9-5372-c19a-000000000028 19110 1726882561.05437: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 19110 1726882561.05531: no more pending results, returning what we have 19110 1726882561.05552: results queue empty 19110 1726882561.05553: checking for 
any_errors_fatal 19110 1726882561.05559: done checking for any_errors_fatal 19110 1726882561.05560: checking for max_fail_percentage 19110 1726882561.05561: done checking for max_fail_percentage 19110 1726882561.05562: checking to see if all hosts have failed and the running result is not ok 19110 1726882561.05565: done checking to see if all hosts have failed 19110 1726882561.05566: getting the remaining hosts for this loop 19110 1726882561.05568: done getting the remaining hosts for this loop 19110 1726882561.05571: getting the next task for host managed_node1 19110 1726882561.05606: done getting next task for host managed_node1 19110 1726882561.05611: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 19110 1726882561.05614: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19110 1726882561.05646: getting variables 19110 1726882561.05649: in VariableManager get_vars() 19110 1726882561.05710: Calling all_inventory to load vars for managed_node1 19110 1726882561.05714: Calling groups_inventory to load vars for managed_node1 19110 1726882561.05717: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882561.05765: Calling all_plugins_play to load vars for managed_node1 19110 1726882561.05769: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882561.05773: Calling groups_plugins_play to load vars for managed_node1 19110 1726882561.10528: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882561.13332: done with get_vars() 19110 1726882561.13446: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 21:36:01 -0400 (0:00:00.110) 0:00:17.993 ****** 19110 1726882561.13605: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 19110 1726882561.13607: Creating lock for fedora.linux_system_roles.network_connections 19110 1726882561.14571: worker is 1 (out of 1 available) 19110 1726882561.14585: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 19110 1726882561.14638: done queuing things up, now waiting for results queue to drain 19110 1726882561.14640: waiting for pending results... 
19110 1726882561.15262: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 19110 1726882561.15608: in run() - task 0e448fcc-3ce9-5372-c19a-000000000029 19110 1726882561.15682: variable 'ansible_search_path' from source: unknown 19110 1726882561.15689: variable 'ansible_search_path' from source: unknown 19110 1726882561.15917: calling self._execute() 19110 1726882561.16318: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882561.16323: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882561.16332: variable 'omit' from source: magic vars 19110 1726882561.16735: variable 'ansible_distribution_major_version' from source: facts 19110 1726882561.16746: Evaluated conditional (ansible_distribution_major_version != '6'): True 19110 1726882561.16751: variable 'omit' from source: magic vars 19110 1726882561.16796: variable 'omit' from source: magic vars 19110 1726882561.16961: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 19110 1726882561.21783: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 19110 1726882561.22031: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 19110 1726882561.22137: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 19110 1726882561.22243: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 19110 1726882561.22374: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 19110 1726882561.22625: variable 'network_provider' from source: set_fact 19110 1726882561.23170: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19110 1726882561.23173: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19110 1726882561.23272: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19110 1726882561.23276: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19110 1726882561.23326: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19110 1726882561.23457: variable 'omit' from source: magic vars 19110 1726882561.23948: variable 'omit' from source: magic vars 19110 1726882561.24082: variable 'network_connections' from source: play vars 19110 1726882561.24093: variable 'interface' from source: set_fact 19110 1726882561.24553: variable 'interface' from source: set_fact 19110 1726882561.24559: variable 'interface' from source: set_fact 19110 1726882561.24731: variable 'interface' from source: set_fact 19110 1726882561.24940: variable 'omit' from source: magic vars 19110 1726882561.24947: variable '__lsr_ansible_managed' from source: task vars 19110 1726882561.25007: variable '__lsr_ansible_managed' from source: task vars 19110 1726882561.25308: Loaded config def from plugin (lookup/template) 19110 1726882561.25312: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 19110 1726882561.25343: File lookup term: get_ansible_managed.j2 19110 
1726882561.25346: variable 'ansible_search_path' from source: unknown 19110 1726882561.25349: evaluation_path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 19110 1726882561.25366: search_path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 19110 1726882561.25402: variable 'ansible_search_path' from source: unknown 19110 1726882561.33825: variable 'ansible_managed' from source: unknown 19110 1726882561.33962: variable 'omit' from source: magic vars 19110 1726882561.33990: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19110 1726882561.34017: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19110 1726882561.34035: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19110 1726882561.34051: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19110 1726882561.34062: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19110 1726882561.34097: variable 'inventory_hostname' from source: host vars for 'managed_node1' 19110 1726882561.34100: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882561.34103: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882561.34200: Set connection var ansible_timeout to 10 19110 1726882561.34213: Set connection var ansible_module_compression to ZIP_DEFLATED 19110 1726882561.34218: Set connection var ansible_shell_executable to /bin/sh 19110 1726882561.34221: Set connection var ansible_shell_type to sh 19110 1726882561.34223: Set connection var ansible_connection to ssh 19110 1726882561.34229: Set connection var ansible_pipelining to False 19110 1726882561.34251: variable 'ansible_shell_executable' from source: unknown 19110 1726882561.34257: variable 'ansible_connection' from source: unknown 19110 1726882561.34260: variable 'ansible_module_compression' from source: unknown 19110 1726882561.34262: variable 'ansible_shell_type' from source: unknown 19110 1726882561.34271: variable 'ansible_shell_executable' from source: unknown 19110 1726882561.34273: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882561.34275: variable 'ansible_pipelining' from source: unknown 19110 1726882561.34278: variable 'ansible_timeout' from source: unknown 19110 1726882561.34280: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882561.34409: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 19110 1726882561.34420: variable 'omit' from source: magic vars 19110 1726882561.34425: starting attempt loop 19110 1726882561.34428: running the handler 19110 1726882561.34441: _low_level_execute_command(): starting 19110 1726882561.34448: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 19110 1726882561.35161: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882561.35174: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882561.35184: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882561.35196: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882561.35234: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882561.35240: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882561.35250: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882561.35262: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882561.35273: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882561.35284: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882561.35291: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882561.35300: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882561.35310: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882561.35316: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 
10.31.44.90 originally 10.31.44.90 <<< 19110 1726882561.35322: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882561.35331: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882561.35406: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882561.35421: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882561.35424: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882561.35557: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882561.37236: stdout chunk (state=3): >>>/root <<< 19110 1726882561.37339: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882561.37421: stderr chunk (state=3): >>><<< 19110 1726882561.37427: stdout chunk (state=3): >>><<< 19110 1726882561.37457: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 1726882561.37470: _low_level_execute_command(): starting 19110 1726882561.37476: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882561.3745568-19936-4860776705687 `" && echo ansible-tmp-1726882561.3745568-19936-4860776705687="` echo /root/.ansible/tmp/ansible-tmp-1726882561.3745568-19936-4860776705687 `" ) && sleep 0' 19110 1726882561.38196: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882561.38208: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882561.38219: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882561.38234: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882561.38274: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882561.38290: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882561.38300: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882561.38315: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882561.38322: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882561.38333: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882561.38341: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882561.38350: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882561.38361: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882561.38371: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882561.38378: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882561.38387: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882561.38461: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882561.38480: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882561.38492: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882561.38618: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882561.40485: stdout chunk (state=3): >>>ansible-tmp-1726882561.3745568-19936-4860776705687=/root/.ansible/tmp/ansible-tmp-1726882561.3745568-19936-4860776705687 <<< 19110 1726882561.40780: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882561.40827: stderr chunk (state=3): >>><<< 19110 1726882561.40831: stdout chunk (state=3): >>><<< 19110 1726882561.40850: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882561.3745568-19936-4860776705687=/root/.ansible/tmp/ansible-tmp-1726882561.3745568-19936-4860776705687 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 1726882561.40899: variable 'ansible_module_compression' from source: unknown 19110 1726882561.40945: ANSIBALLZ: Using lock for fedora.linux_system_roles.network_connections 19110 1726882561.40949: ANSIBALLZ: Acquiring lock 19110 1726882561.40951: ANSIBALLZ: Lock acquired: 139855662696208 19110 1726882561.40954: ANSIBALLZ: Creating module 19110 1726882561.64930: ANSIBALLZ: Writing module into payload 19110 1726882561.65377: ANSIBALLZ: Writing module 19110 1726882561.65407: ANSIBALLZ: Renaming module 19110 1726882561.65412: ANSIBALLZ: Done creating module 19110 1726882561.65438: variable 'ansible_facts' from source: unknown 19110 1726882561.65522: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882561.3745568-19936-4860776705687/AnsiballZ_network_connections.py 19110 1726882561.65660: Sending initial data 19110 1726882561.65663: Sent initial data (166 bytes) 19110 1726882561.66603: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882561.66612: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882561.66622: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882561.66636: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882561.66679: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 
1726882561.66686: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882561.66696: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882561.66710: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882561.66717: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882561.66724: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882561.66732: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882561.66742: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882561.66758: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882561.66761: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882561.66766: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882561.66783: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882561.66853: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882561.66871: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882561.66885: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882561.67031: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882561.68870: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension 
"fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 19110 1726882561.68967: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 19110 1726882561.69069: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-191108pnkimox/tmpqo7k5azs /root/.ansible/tmp/ansible-tmp-1726882561.3745568-19936-4860776705687/AnsiballZ_network_connections.py <<< 19110 1726882561.69160: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 19110 1726882561.71014: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882561.71109: stderr chunk (state=3): >>><<< 19110 1726882561.71113: stdout chunk (state=3): >>><<< 19110 1726882561.71136: done transferring module to remote 19110 1726882561.71146: _low_level_execute_command(): starting 19110 1726882561.71151: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882561.3745568-19936-4860776705687/ /root/.ansible/tmp/ansible-tmp-1726882561.3745568-19936-4860776705687/AnsiballZ_network_connections.py && sleep 0' 19110 1726882561.71832: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882561.71841: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882561.71852: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882561.71868: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882561.71911: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882561.71919: stderr chunk (state=3): >>>debug2: 
match not found <<< 19110 1726882561.71929: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882561.71943: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882561.71951: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882561.71960: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882561.71966: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882561.71978: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882561.71991: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882561.72002: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882561.72010: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882561.72017: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882561.72089: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882561.72108: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882561.72123: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882561.72240: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882561.74270: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882561.74273: stdout chunk (state=3): >>><<< 19110 1726882561.74275: stderr chunk (state=3): >>><<< 19110 1726882561.74277: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 1726882561.74283: _low_level_execute_command(): starting 19110 1726882561.74285: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882561.3745568-19936-4860776705687/AnsiballZ_network_connections.py && sleep 0' 19110 1726882561.74867: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882561.74875: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882561.74885: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882561.74898: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882561.74933: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882561.74941: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882561.74950: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882561.74962: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882561.74971: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882561.74977: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882561.74985: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882561.74993: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882561.75004: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882561.75010: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882561.75016: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882561.75025: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882561.75098: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882561.75111: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882561.75122: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882561.75246: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882562.01599: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[003] #0, state:up persistent_state:present, 'lsr27': add connection lsr27, 264de621-b20e-42af-8432-5f491fad83e4\n[004] #0, state:up persistent_state:present, 'lsr27': up connection lsr27, 264de621-b20e-42af-8432-5f491fad83e4 (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr27", "interface_name": "lsr27", "state": "up", "type": "ethernet", "autoconnect": true, "ip": {"address": "192.0.2.1/24"}}], 
"__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr27", "interface_name": "lsr27", "state": "up", "type": "ethernet", "autoconnect": true, "ip": {"address": "192.0.2.1/24"}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 19110 1726882562.03890: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. <<< 19110 1726882562.03939: stderr chunk (state=3): >>><<< 19110 1726882562.03942: stdout chunk (state=3): >>><<< 19110 1726882562.03962: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[003] #0, state:up persistent_state:present, 'lsr27': add connection lsr27, 264de621-b20e-42af-8432-5f491fad83e4\n[004] #0, state:up persistent_state:present, 'lsr27': up connection lsr27, 264de621-b20e-42af-8432-5f491fad83e4 (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr27", "interface_name": "lsr27", "state": "up", "type": "ethernet", "autoconnect": true, "ip": {"address": "192.0.2.1/24"}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr27", "interface_name": "lsr27", "state": "up", "type": "ethernet", "autoconnect": true, "ip": {"address": "192.0.2.1/24"}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 19110 1726882562.04014: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'lsr27', 'interface_name': 'lsr27', 'state': 'up', 'type': 'ethernet', 'autoconnect': True, 'ip': {'address': '192.0.2.1/24'}}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882561.3745568-19936-4860776705687/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, 
'_ansible_target_log_info': None}) 19110 1726882562.04023: _low_level_execute_command(): starting 19110 1726882562.04028: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882561.3745568-19936-4860776705687/ > /dev/null 2>&1 && sleep 0' 19110 1726882562.04725: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882562.04733: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882562.04743: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882562.04767: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882562.04808: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882562.04815: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882562.04824: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882562.04836: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882562.04843: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882562.04849: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882562.04859: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882562.04876: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882562.04891: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882562.04897: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882562.04903: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882562.04912: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882562.04994: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882562.05012: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882562.05023: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882562.05139: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882562.06962: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882562.07029: stderr chunk (state=3): >>><<< 19110 1726882562.07037: stdout chunk (state=3): >>><<< 19110 1726882562.07063: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 1726882562.07071: handler run complete 19110 1726882562.07104: attempt loop complete, 
returning result 19110 1726882562.07107: _execute() done 19110 1726882562.07110: dumping result to json 19110 1726882562.07115: done dumping result, returning 19110 1726882562.07125: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0e448fcc-3ce9-5372-c19a-000000000029] 19110 1726882562.07130: sending task result for task 0e448fcc-3ce9-5372-c19a-000000000029 19110 1726882562.07245: done sending task result for task 0e448fcc-3ce9-5372-c19a-000000000029 19110 1726882562.07248: WORKER PROCESS EXITING changed: [managed_node1] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "autoconnect": true, "interface_name": "lsr27", "ip": { "address": "192.0.2.1/24" }, "name": "lsr27", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [003] #0, state:up persistent_state:present, 'lsr27': add connection lsr27, 264de621-b20e-42af-8432-5f491fad83e4 [004] #0, state:up persistent_state:present, 'lsr27': up connection lsr27, 264de621-b20e-42af-8432-5f491fad83e4 (not-active) 19110 1726882562.07382: no more pending results, returning what we have 19110 1726882562.07386: results queue empty 19110 1726882562.07387: checking for any_errors_fatal 19110 1726882562.07394: done checking for any_errors_fatal 19110 1726882562.07395: checking for max_fail_percentage 19110 1726882562.07397: done checking for max_fail_percentage 19110 1726882562.07398: checking to see if all hosts have failed and the running result is not ok 19110 1726882562.07399: done checking to see if all hosts have failed 19110 1726882562.07400: getting the remaining hosts for this loop 19110 1726882562.07401: done getting the remaining hosts for this loop 19110 1726882562.07405: getting the next task for host managed_node1 19110 1726882562.07411: done getting next task 
for host managed_node1 19110 1726882562.07415: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 19110 1726882562.07417: ^ state is: HOST STATE: block=2, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19110 1726882562.07428: getting variables 19110 1726882562.07430: in VariableManager get_vars() 19110 1726882562.07473: Calling all_inventory to load vars for managed_node1 19110 1726882562.07476: Calling groups_inventory to load vars for managed_node1 19110 1726882562.07479: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882562.07490: Calling all_plugins_play to load vars for managed_node1 19110 1726882562.07493: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882562.07496: Calling groups_plugins_play to load vars for managed_node1 19110 1726882562.09550: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882562.11726: done with get_vars() 19110 1726882562.11753: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 21:36:02 -0400 (0:00:00.982) 0:00:18.975 ****** 19110 1726882562.11843: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_state 19110 1726882562.11845: Creating lock for fedora.linux_system_roles.network_state 19110 1726882562.12197: worker is 1 (out of 1 available) 19110 1726882562.12211: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_state 19110 1726882562.12224: done queuing things up, now waiting for results queue to drain 19110 1726882562.12226: waiting for 
pending results... 19110 1726882562.12525: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state 19110 1726882562.12640: in run() - task 0e448fcc-3ce9-5372-c19a-00000000002a 19110 1726882562.12669: variable 'ansible_search_path' from source: unknown 19110 1726882562.12683: variable 'ansible_search_path' from source: unknown 19110 1726882562.12727: calling self._execute() 19110 1726882562.12833: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882562.12845: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882562.12861: variable 'omit' from source: magic vars 19110 1726882562.13270: variable 'ansible_distribution_major_version' from source: facts 19110 1726882562.13287: Evaluated conditional (ansible_distribution_major_version != '6'): True 19110 1726882562.13418: variable 'network_state' from source: role '' defaults 19110 1726882562.13441: Evaluated conditional (network_state != {}): False 19110 1726882562.13449: when evaluation is False, skipping this task 19110 1726882562.13461: _execute() done 19110 1726882562.13472: dumping result to json 19110 1726882562.13481: done dumping result, returning 19110 1726882562.13492: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state [0e448fcc-3ce9-5372-c19a-00000000002a] 19110 1726882562.13504: sending task result for task 0e448fcc-3ce9-5372-c19a-00000000002a skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 19110 1726882562.13667: no more pending results, returning what we have 19110 1726882562.13671: results queue empty 19110 1726882562.13673: checking for any_errors_fatal 19110 1726882562.13685: done checking for any_errors_fatal 19110 1726882562.13686: checking for max_fail_percentage 19110 1726882562.13688: done checking for max_fail_percentage 19110 
1726882562.13689: checking to see if all hosts have failed and the running result is not ok 19110 1726882562.13690: done checking to see if all hosts have failed 19110 1726882562.13690: getting the remaining hosts for this loop 19110 1726882562.13692: done getting the remaining hosts for this loop 19110 1726882562.13696: getting the next task for host managed_node1 19110 1726882562.13703: done getting next task for host managed_node1 19110 1726882562.13707: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 19110 1726882562.13710: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19110 1726882562.13725: getting variables 19110 1726882562.13727: in VariableManager get_vars() 19110 1726882562.13769: Calling all_inventory to load vars for managed_node1 19110 1726882562.13773: Calling groups_inventory to load vars for managed_node1 19110 1726882562.13775: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882562.13789: Calling all_plugins_play to load vars for managed_node1 19110 1726882562.13792: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882562.13795: Calling groups_plugins_play to load vars for managed_node1 19110 1726882562.14804: done sending task result for task 0e448fcc-3ce9-5372-c19a-00000000002a 19110 1726882562.14807: WORKER PROCESS EXITING 19110 1726882562.15578: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882562.17373: done with get_vars() 19110 1726882562.17394: done getting variables 19110 1726882562.17459: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 21:36:02 -0400 (0:00:00.056) 0:00:19.032 ****** 19110 1726882562.17491: entering _queue_task() for managed_node1/debug 19110 1726882562.17796: worker is 1 (out of 1 available) 19110 1726882562.17807: exiting _queue_task() for managed_node1/debug 19110 1726882562.17820: done queuing things up, now waiting for results queue to drain 19110 1726882562.17822: waiting for pending results... 19110 1726882562.18114: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 19110 1726882562.18240: in run() - task 0e448fcc-3ce9-5372-c19a-00000000002b 19110 1726882562.18273: variable 'ansible_search_path' from source: unknown 19110 1726882562.18282: variable 'ansible_search_path' from source: unknown 19110 1726882562.18329: calling self._execute() 19110 1726882562.18437: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882562.18448: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882562.18466: variable 'omit' from source: magic vars 19110 1726882562.18888: variable 'ansible_distribution_major_version' from source: facts 19110 1726882562.18907: Evaluated conditional (ansible_distribution_major_version != '6'): True 19110 1726882562.18923: variable 'omit' from source: magic vars 19110 1726882562.18973: variable 'omit' from source: magic vars 19110 1726882562.19014: variable 'omit' from source: magic vars 19110 1726882562.19073: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19110 1726882562.19112: Loading 
Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19110 1726882562.19142: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19110 1726882562.19170: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19110 1726882562.19191: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19110 1726882562.19226: variable 'inventory_hostname' from source: host vars for 'managed_node1' 19110 1726882562.19235: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882562.19247: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882562.19366: Set connection var ansible_timeout to 10 19110 1726882562.19386: Set connection var ansible_module_compression to ZIP_DEFLATED 19110 1726882562.19402: Set connection var ansible_shell_executable to /bin/sh 19110 1726882562.19409: Set connection var ansible_shell_type to sh 19110 1726882562.19414: Set connection var ansible_connection to ssh 19110 1726882562.19424: Set connection var ansible_pipelining to False 19110 1726882562.19450: variable 'ansible_shell_executable' from source: unknown 19110 1726882562.19466: variable 'ansible_connection' from source: unknown 19110 1726882562.19477: variable 'ansible_module_compression' from source: unknown 19110 1726882562.19484: variable 'ansible_shell_type' from source: unknown 19110 1726882562.19490: variable 'ansible_shell_executable' from source: unknown 19110 1726882562.19496: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882562.19508: variable 'ansible_pipelining' from source: unknown 19110 1726882562.19515: variable 'ansible_timeout' from source: unknown 19110 1726882562.19521: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node1' 19110 1726882562.19675: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 19110 1726882562.19696: variable 'omit' from source: magic vars 19110 1726882562.19707: starting attempt loop 19110 1726882562.19714: running the handler 19110 1726882562.19860: variable '__network_connections_result' from source: set_fact 19110 1726882562.19927: handler run complete 19110 1726882562.19959: attempt loop complete, returning result 19110 1726882562.19970: _execute() done 19110 1726882562.19978: dumping result to json 19110 1726882562.19986: done dumping result, returning 19110 1726882562.19999: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0e448fcc-3ce9-5372-c19a-00000000002b] 19110 1726882562.20014: sending task result for task 0e448fcc-3ce9-5372-c19a-00000000002b ok: [managed_node1] => { "__network_connections_result.stderr_lines": [ "[003] #0, state:up persistent_state:present, 'lsr27': add connection lsr27, 264de621-b20e-42af-8432-5f491fad83e4", "[004] #0, state:up persistent_state:present, 'lsr27': up connection lsr27, 264de621-b20e-42af-8432-5f491fad83e4 (not-active)" ] } 19110 1726882562.20183: no more pending results, returning what we have 19110 1726882562.20187: results queue empty 19110 1726882562.20188: checking for any_errors_fatal 19110 1726882562.20196: done checking for any_errors_fatal 19110 1726882562.20197: checking for max_fail_percentage 19110 1726882562.20199: done checking for max_fail_percentage 19110 1726882562.20200: checking to see if all hosts have failed and the running result is not ok 19110 1726882562.20201: done checking to see if all hosts have failed 19110 1726882562.20202: getting the 
remaining hosts for this loop 19110 1726882562.20203: done getting the remaining hosts for this loop 19110 1726882562.20207: getting the next task for host managed_node1 19110 1726882562.20214: done getting next task for host managed_node1 19110 1726882562.20218: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 19110 1726882562.20222: ^ state is: HOST STATE: block=2, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19110 1726882562.20232: getting variables 19110 1726882562.20234: in VariableManager get_vars() 19110 1726882562.20279: Calling all_inventory to load vars for managed_node1 19110 1726882562.20283: Calling groups_inventory to load vars for managed_node1 19110 1726882562.20286: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882562.20296: Calling all_plugins_play to load vars for managed_node1 19110 1726882562.20300: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882562.20304: Calling groups_plugins_play to load vars for managed_node1 19110 1726882562.21303: done sending task result for task 0e448fcc-3ce9-5372-c19a-00000000002b 19110 1726882562.21306: WORKER PROCESS EXITING 19110 1726882562.22205: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882562.23968: done with get_vars() 19110 1726882562.23993: done getting variables 19110 1726882562.24065: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK 
[fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 21:36:02 -0400 (0:00:00.066) 0:00:19.098 ****** 19110 1726882562.24098: entering _queue_task() for managed_node1/debug 19110 1726882562.24433: worker is 1 (out of 1 available) 19110 1726882562.24448: exiting _queue_task() for managed_node1/debug 19110 1726882562.24465: done queuing things up, now waiting for results queue to drain 19110 1726882562.24467: waiting for pending results... 19110 1726882562.24762: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 19110 1726882562.24898: in run() - task 0e448fcc-3ce9-5372-c19a-00000000002c 19110 1726882562.24926: variable 'ansible_search_path' from source: unknown 19110 1726882562.24935: variable 'ansible_search_path' from source: unknown 19110 1726882562.24981: calling self._execute() 19110 1726882562.25085: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882562.25097: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882562.25110: variable 'omit' from source: magic vars 19110 1726882562.25507: variable 'ansible_distribution_major_version' from source: facts 19110 1726882562.25527: Evaluated conditional (ansible_distribution_major_version != '6'): True 19110 1726882562.25539: variable 'omit' from source: magic vars 19110 1726882562.25590: variable 'omit' from source: magic vars 19110 1726882562.25630: variable 'omit' from source: magic vars 19110 1726882562.25685: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19110 1726882562.25725: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19110 1726882562.25751: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19110 1726882562.25784: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19110 1726882562.25802: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19110 1726882562.25837: variable 'inventory_hostname' from source: host vars for 'managed_node1' 19110 1726882562.25847: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882562.25858: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882562.25973: Set connection var ansible_timeout to 10 19110 1726882562.25990: Set connection var ansible_module_compression to ZIP_DEFLATED 19110 1726882562.26004: Set connection var ansible_shell_executable to /bin/sh 19110 1726882562.26010: Set connection var ansible_shell_type to sh 19110 1726882562.26016: Set connection var ansible_connection to ssh 19110 1726882562.26023: Set connection var ansible_pipelining to False 19110 1726882562.26046: variable 'ansible_shell_executable' from source: unknown 19110 1726882562.26052: variable 'ansible_connection' from source: unknown 19110 1726882562.26060: variable 'ansible_module_compression' from source: unknown 19110 1726882562.26067: variable 'ansible_shell_type' from source: unknown 19110 1726882562.26072: variable 'ansible_shell_executable' from source: unknown 19110 1726882562.26078: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882562.26084: variable 'ansible_pipelining' from source: unknown 19110 1726882562.26089: variable 'ansible_timeout' from source: unknown 19110 1726882562.26095: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882562.26237: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 19110 1726882562.26253: variable 'omit' from source: magic vars 19110 1726882562.26267: starting attempt loop 19110 1726882562.26273: running the handler 19110 1726882562.26326: variable '__network_connections_result' from source: set_fact 19110 1726882562.26416: variable '__network_connections_result' from source: set_fact 19110 1726882562.26558: handler run complete 19110 1726882562.26593: attempt loop complete, returning result 19110 1726882562.26600: _execute() done 19110 1726882562.26605: dumping result to json 19110 1726882562.26613: done dumping result, returning 19110 1726882562.26623: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0e448fcc-3ce9-5372-c19a-00000000002c] 19110 1726882562.26633: sending task result for task 0e448fcc-3ce9-5372-c19a-00000000002c 19110 1726882562.26745: done sending task result for task 0e448fcc-3ce9-5372-c19a-00000000002c ok: [managed_node1] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "autoconnect": true, "interface_name": "lsr27", "ip": { "address": "192.0.2.1/24" }, "name": "lsr27", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[003] #0, state:up persistent_state:present, 'lsr27': add connection lsr27, 264de621-b20e-42af-8432-5f491fad83e4\n[004] #0, state:up persistent_state:present, 'lsr27': up connection lsr27, 264de621-b20e-42af-8432-5f491fad83e4 (not-active)\n", "stderr_lines": [ "[003] #0, state:up persistent_state:present, 'lsr27': add connection lsr27, 264de621-b20e-42af-8432-5f491fad83e4", "[004] #0, state:up 
persistent_state:present, 'lsr27': up connection lsr27, 264de621-b20e-42af-8432-5f491fad83e4 (not-active)" ] } } 19110 1726882562.26834: no more pending results, returning what we have 19110 1726882562.26838: results queue empty 19110 1726882562.26839: checking for any_errors_fatal 19110 1726882562.26847: done checking for any_errors_fatal 19110 1726882562.26848: checking for max_fail_percentage 19110 1726882562.26849: done checking for max_fail_percentage 19110 1726882562.26851: checking to see if all hosts have failed and the running result is not ok 19110 1726882562.26851: done checking to see if all hosts have failed 19110 1726882562.26852: getting the remaining hosts for this loop 19110 1726882562.26853: done getting the remaining hosts for this loop 19110 1726882562.26860: getting the next task for host managed_node1 19110 1726882562.26869: done getting next task for host managed_node1 19110 1726882562.26873: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 19110 1726882562.26875: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19110 1726882562.26886: getting variables 19110 1726882562.26888: in VariableManager get_vars() 19110 1726882562.26928: Calling all_inventory to load vars for managed_node1 19110 1726882562.26932: Calling groups_inventory to load vars for managed_node1 19110 1726882562.26935: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882562.26945: Calling all_plugins_play to load vars for managed_node1 19110 1726882562.26948: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882562.26951: Calling groups_plugins_play to load vars for managed_node1 19110 1726882562.27904: WORKER PROCESS EXITING 19110 1726882562.28913: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882562.30699: done with get_vars() 19110 1726882562.30725: done getting variables 19110 1726882562.30796: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 21:36:02 -0400 (0:00:00.067) 0:00:19.165 ****** 19110 1726882562.30826: entering _queue_task() for managed_node1/debug 19110 1726882562.31169: worker is 1 (out of 1 available) 19110 1726882562.31187: exiting _queue_task() for managed_node1/debug 19110 1726882562.31200: done queuing things up, now waiting for results queue to drain 19110 1726882562.31202: waiting for pending results... 
19110 1726882562.31493: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 19110 1726882562.31611: in run() - task 0e448fcc-3ce9-5372-c19a-00000000002d 19110 1726882562.31635: variable 'ansible_search_path' from source: unknown 19110 1726882562.31645: variable 'ansible_search_path' from source: unknown 19110 1726882562.31687: calling self._execute() 19110 1726882562.31783: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882562.31793: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882562.31804: variable 'omit' from source: magic vars 19110 1726882562.32192: variable 'ansible_distribution_major_version' from source: facts 19110 1726882562.32209: Evaluated conditional (ansible_distribution_major_version != '6'): True 19110 1726882562.32329: variable 'network_state' from source: role '' defaults 19110 1726882562.32343: Evaluated conditional (network_state != {}): False 19110 1726882562.32349: when evaluation is False, skipping this task 19110 1726882562.32359: _execute() done 19110 1726882562.32368: dumping result to json 19110 1726882562.32381: done dumping result, returning 19110 1726882562.32397: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0e448fcc-3ce9-5372-c19a-00000000002d] 19110 1726882562.32409: sending task result for task 0e448fcc-3ce9-5372-c19a-00000000002d skipping: [managed_node1] => { "false_condition": "network_state != {}" } 19110 1726882562.32557: no more pending results, returning what we have 19110 1726882562.32562: results queue empty 19110 1726882562.32565: checking for any_errors_fatal 19110 1726882562.32573: done checking for any_errors_fatal 19110 1726882562.32573: checking for max_fail_percentage 19110 1726882562.32576: done checking for max_fail_percentage 19110 1726882562.32576: checking to see if all hosts have 
failed and the running result is not ok 19110 1726882562.32577: done checking to see if all hosts have failed 19110 1726882562.32578: getting the remaining hosts for this loop 19110 1726882562.32580: done getting the remaining hosts for this loop 19110 1726882562.32584: getting the next task for host managed_node1 19110 1726882562.32589: done getting next task for host managed_node1 19110 1726882562.32594: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 19110 1726882562.32597: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19110 1726882562.32612: getting variables 19110 1726882562.32615: in VariableManager get_vars() 19110 1726882562.32654: Calling all_inventory to load vars for managed_node1 19110 1726882562.32660: Calling groups_inventory to load vars for managed_node1 19110 1726882562.32666: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882562.32679: Calling all_plugins_play to load vars for managed_node1 19110 1726882562.32683: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882562.32686: Calling groups_plugins_play to load vars for managed_node1 19110 1726882562.33704: done sending task result for task 0e448fcc-3ce9-5372-c19a-00000000002d 19110 1726882562.33707: WORKER PROCESS EXITING 19110 1726882562.34461: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882562.36238: done with get_vars() 19110 1726882562.36266: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 21:36:02 -0400 
(0:00:00.055) 0:00:19.220 ****** 19110 1726882562.36367: entering _queue_task() for managed_node1/ping 19110 1726882562.36369: Creating lock for ping 19110 1726882562.36682: worker is 1 (out of 1 available) 19110 1726882562.36694: exiting _queue_task() for managed_node1/ping 19110 1726882562.36706: done queuing things up, now waiting for results queue to drain 19110 1726882562.36708: waiting for pending results... 19110 1726882562.36996: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity 19110 1726882562.37110: in run() - task 0e448fcc-3ce9-5372-c19a-00000000002e 19110 1726882562.37132: variable 'ansible_search_path' from source: unknown 19110 1726882562.37138: variable 'ansible_search_path' from source: unknown 19110 1726882562.37183: calling self._execute() 19110 1726882562.37278: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882562.37288: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882562.37298: variable 'omit' from source: magic vars 19110 1726882562.37674: variable 'ansible_distribution_major_version' from source: facts 19110 1726882562.37695: Evaluated conditional (ansible_distribution_major_version != '6'): True 19110 1726882562.37705: variable 'omit' from source: magic vars 19110 1726882562.37745: variable 'omit' from source: magic vars 19110 1726882562.37786: variable 'omit' from source: magic vars 19110 1726882562.37829: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19110 1726882562.37871: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19110 1726882562.37905: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19110 1726882562.37928: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19110 
1726882562.37944: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19110 1726882562.37984: variable 'inventory_hostname' from source: host vars for 'managed_node1' 19110 1726882562.37996: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882562.38004: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882562.38117: Set connection var ansible_timeout to 10 19110 1726882562.38138: Set connection var ansible_module_compression to ZIP_DEFLATED 19110 1726882562.38149: Set connection var ansible_shell_executable to /bin/sh 19110 1726882562.38158: Set connection var ansible_shell_type to sh 19110 1726882562.38168: Set connection var ansible_connection to ssh 19110 1726882562.38179: Set connection var ansible_pipelining to False 19110 1726882562.38205: variable 'ansible_shell_executable' from source: unknown 19110 1726882562.38217: variable 'ansible_connection' from source: unknown 19110 1726882562.38227: variable 'ansible_module_compression' from source: unknown 19110 1726882562.38237: variable 'ansible_shell_type' from source: unknown 19110 1726882562.38244: variable 'ansible_shell_executable' from source: unknown 19110 1726882562.38250: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882562.38261: variable 'ansible_pipelining' from source: unknown 19110 1726882562.38271: variable 'ansible_timeout' from source: unknown 19110 1726882562.38279: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882562.38502: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 19110 1726882562.38518: variable 'omit' from source: magic vars 19110 1726882562.38528: 
starting attempt loop 19110 1726882562.38540: running the handler 19110 1726882562.38565: _low_level_execute_command(): starting 19110 1726882562.38579: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 19110 1726882562.39390: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882562.39411: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882562.39430: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882562.39448: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882562.39499: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882562.39514: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882562.39535: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882562.39557: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882562.39572: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882562.39582: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882562.39593: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882562.39606: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882562.39622: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882562.39639: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882562.39652: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882562.39673: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882562.39751: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882562.39774: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882562.39788: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882562.39973: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882562.41599: stdout chunk (state=3): >>>/root <<< 19110 1726882562.41802: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882562.41805: stdout chunk (state=3): >>><<< 19110 1726882562.41807: stderr chunk (state=3): >>><<< 19110 1726882562.41928: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 1726882562.41932: _low_level_execute_command(): 
starting 19110 1726882562.41936: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882562.4182754-19980-188173112127540 `" && echo ansible-tmp-1726882562.4182754-19980-188173112127540="` echo /root/.ansible/tmp/ansible-tmp-1726882562.4182754-19980-188173112127540 `" ) && sleep 0' 19110 1726882562.42535: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882562.42549: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882562.42571: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882562.42590: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882562.42633: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882562.42643: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882562.42659: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882562.42682: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882562.42694: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882562.42703: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882562.42714: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882562.42725: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882562.42737: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882562.42747: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882562.42760: stderr 
chunk (state=3): >>>debug2: match found <<< 19110 1726882562.42775: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882562.42857: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882562.42881: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882562.42900: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882562.43029: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882562.44901: stdout chunk (state=3): >>>ansible-tmp-1726882562.4182754-19980-188173112127540=/root/.ansible/tmp/ansible-tmp-1726882562.4182754-19980-188173112127540 <<< 19110 1726882562.45102: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882562.45105: stdout chunk (state=3): >>><<< 19110 1726882562.45108: stderr chunk (state=3): >>><<< 19110 1726882562.45375: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882562.4182754-19980-188173112127540=/root/.ansible/tmp/ansible-tmp-1726882562.4182754-19980-188173112127540 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 1726882562.45378: variable 'ansible_module_compression' from source: unknown 19110 1726882562.45381: ANSIBALLZ: Using lock for ping 19110 1726882562.45383: ANSIBALLZ: Acquiring lock 19110 1726882562.45385: ANSIBALLZ: Lock acquired: 139855634441824 19110 1726882562.45386: ANSIBALLZ: Creating module 19110 1726882562.68631: ANSIBALLZ: Writing module into payload 19110 1726882562.68724: ANSIBALLZ: Writing module 19110 1726882562.68761: ANSIBALLZ: Renaming module 19110 1726882562.68773: ANSIBALLZ: Done creating module 19110 1726882562.68799: variable 'ansible_facts' from source: unknown 19110 1726882562.68881: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882562.4182754-19980-188173112127540/AnsiballZ_ping.py 19110 1726882562.69042: Sending initial data 19110 1726882562.69045: Sent initial data (153 bytes) 19110 1726882562.71666: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882562.71670: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882562.71703: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882562.71706: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config <<< 19110 1726882562.71709: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882562.71785: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882562.71788: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882562.72012: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882562.73886: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 19110 1726882562.73981: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 19110 1726882562.74088: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-191108pnkimox/tmp3itddxzk /root/.ansible/tmp/ansible-tmp-1726882562.4182754-19980-188173112127540/AnsiballZ_ping.py <<< 19110 1726882562.74189: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 19110 1726882562.75559: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882562.75650: stderr chunk (state=3): >>><<< 19110 1726882562.75653: 
stdout chunk (state=3): >>><<< 19110 1726882562.75680: done transferring module to remote 19110 1726882562.75692: _low_level_execute_command(): starting 19110 1726882562.75695: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882562.4182754-19980-188173112127540/ /root/.ansible/tmp/ansible-tmp-1726882562.4182754-19980-188173112127540/AnsiballZ_ping.py && sleep 0' 19110 1726882562.76780: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882562.76977: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882562.77028: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882562.78862: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882562.78868: stderr chunk (state=3): >>><<< 19110 1726882562.78871: stdout chunk (state=3): >>><<< 19110 1726882562.78889: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 1726882562.78892: _low_level_execute_command(): starting 19110 1726882562.78897: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882562.4182754-19980-188173112127540/AnsiballZ_ping.py && sleep 0' 19110 1726882562.80116: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882562.80128: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882562.80140: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882562.80157: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882562.80207: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882562.80217: stderr chunk (state=3): >>>debug2: match not 
found <<< 19110 1726882562.80228: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882562.80242: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882562.80252: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882562.80265: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882562.80276: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882562.80292: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882562.80309: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882562.80318: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882562.80328: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882562.80340: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882562.80426: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882562.80450: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882562.80471: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882562.80602: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882562.93461: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 19110 1726882562.94500: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
<<< 19110 1726882562.94504: stdout chunk (state=3): >>><<< 19110 1726882562.94507: stderr chunk (state=3): >>><<< 19110 1726882562.94641: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
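The exchange above is one complete remote-module round trip: `_low_level_execute_command()` creates a temp directory, the AnsiballZ-wrapped ping payload goes over SFTP, gets `chmod u+x`, runs under the remote `/usr/bin/python3.9`, and the controller reads `{"ping": "pong"}` back from stdout before cleaning up the tmpdir. A minimal sketch of the module side only (my own illustrative function, not the real AnsiballZ payload):

```python
import json

def ping_module(data="pong"):
    # Illustrative stand-in for ansible.builtin.ping: echo the 'data'
    # argument back; the real module raises when data == "crash".
    if data == "crash":
        raise Exception("boom")
    return {"ping": data, "invocation": {"module_args": {"data": data}}}

# The controller parses this JSON from the remote interpreter's stdout,
# matching the shape seen in the stdout chunk above.
print(json.dumps(ping_module()))
```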
19110 1726882562.94645: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882562.4182754-19980-188173112127540/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 19110 1726882562.94648: _low_level_execute_command(): starting 19110 1726882562.94650: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882562.4182754-19980-188173112127540/ > /dev/null 2>&1 && sleep 0' 19110 1726882562.95979: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882562.96101: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882562.96105: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882562.96153: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found <<< 19110 1726882562.96166: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882562.96168: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found <<< 19110 1726882562.96170: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882562.96281: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882562.96345: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882562.96524: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882562.98381: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882562.98385: stdout chunk (state=3): >>><<< 19110 1726882562.98391: stderr chunk (state=3): >>><<< 19110 1726882562.98412: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 
2 debug2: Received exit status from master 0 19110 1726882562.98418: handler run complete 19110 1726882562.98435: attempt loop complete, returning result 19110 1726882562.98438: _execute() done 19110 1726882562.98440: dumping result to json 19110 1726882562.98443: done dumping result, returning 19110 1726882562.98453: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity [0e448fcc-3ce9-5372-c19a-00000000002e] 19110 1726882562.98459: sending task result for task 0e448fcc-3ce9-5372-c19a-00000000002e 19110 1726882562.98553: done sending task result for task 0e448fcc-3ce9-5372-c19a-00000000002e 19110 1726882562.98558: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "ping": "pong" } 19110 1726882562.98724: no more pending results, returning what we have 19110 1726882562.98727: results queue empty 19110 1726882562.98728: checking for any_errors_fatal 19110 1726882562.98733: done checking for any_errors_fatal 19110 1726882562.98734: checking for max_fail_percentage 19110 1726882562.98736: done checking for max_fail_percentage 19110 1726882562.98736: checking to see if all hosts have failed and the running result is not ok 19110 1726882562.98737: done checking to see if all hosts have failed 19110 1726882562.98738: getting the remaining hosts for this loop 19110 1726882562.98739: done getting the remaining hosts for this loop 19110 1726882562.98743: getting the next task for host managed_node1 19110 1726882562.98749: done getting next task for host managed_node1 19110 1726882562.98752: ^ task is: TASK: meta (role_complete) 19110 1726882562.98754: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19110 1726882562.98768: getting variables 19110 1726882562.98770: in VariableManager get_vars() 19110 1726882562.98810: Calling all_inventory to load vars for managed_node1 19110 1726882562.98813: Calling groups_inventory to load vars for managed_node1 19110 1726882562.98815: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882562.98826: Calling all_plugins_play to load vars for managed_node1 19110 1726882562.98829: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882562.98832: Calling groups_plugins_play to load vars for managed_node1 19110 1726882563.01150: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882563.02958: done with get_vars() 19110 1726882563.02989: done getting variables 19110 1726882563.03080: done queuing things up, now waiting for results queue to drain 19110 1726882563.03082: results queue empty 19110 1726882563.03083: checking for any_errors_fatal 19110 1726882563.03086: done checking for any_errors_fatal 19110 1726882563.03087: checking for max_fail_percentage 19110 1726882563.03087: done checking for max_fail_percentage 19110 1726882563.03088: checking to see if all hosts have failed and the running result is not ok 19110 1726882563.03089: done checking to see if all hosts have failed 19110 1726882563.03090: getting the remaining hosts for this loop 19110 1726882563.03091: done getting the remaining hosts for this loop 19110 1726882563.03093: getting the next task for host managed_node1 19110 1726882563.03097: done getting next task for host managed_node1 19110 1726882563.03104: ^ task is: TASK: Include the task 'assert_output_in_stderr_without_warnings.yml' 19110 1726882563.03106: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False 19110 1726882563.03108: getting variables 19110 1726882563.03109: in VariableManager get_vars() 19110 1726882563.03121: Calling all_inventory to load vars for managed_node1 19110 1726882563.03123: Calling groups_inventory to load vars for managed_node1 19110 1726882563.03125: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882563.03130: Calling all_plugins_play to load vars for managed_node1 19110 1726882563.03132: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882563.03135: Calling groups_plugins_play to load vars for managed_node1 19110 1726882563.04405: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882563.11865: done with get_vars() 19110 1726882563.11912: done getting variables TASK [Include the task 'assert_output_in_stderr_without_warnings.yml'] ********* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:47 Friday 20 September 2024 21:36:03 -0400 (0:00:00.756) 0:00:19.977 ****** 19110 1726882563.11990: entering _queue_task() for managed_node1/include_tasks 19110 1726882563.12373: worker is 1 (out of 1 available) 19110 1726882563.12385: exiting _queue_task() for managed_node1/include_tasks 19110 1726882563.12396: done queuing things up, now waiting for results queue to drain 19110 1726882563.12398: waiting for pending results... 
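Every entry in this trace follows the shape `<pid> <epoch seconds>: <message>` (here pid 19110), which makes even long fused runs like the one above mechanically parseable, e.g. to measure the gap between two events. A small helper for splitting them back apart (an illustrative sketch, not part of Ansible):

```python
import re

# Matches entries like "19110 1726882563.12373: worker is 1 (out of 1 available)";
# the lookahead stops each message at the start of the next "<pid> <ts>:" header.
ENTRY = re.compile(r"(?P<pid>\d+) (?P<ts>\d+\.\d+): (?P<msg>.*?)(?= \d+ \d+\.\d+: |$)")

def parse_entries(text):
    """Split fused -vvv output into (pid, timestamp, message) tuples."""
    return [(int(m["pid"]), float(m["ts"]), m["msg"].strip())
            for m in ENTRY.finditer(text)]

# Two consecutive entries copied from the trace above.
sample = ("19110 1726882563.12373: worker is 1 (out of 1 available) "
          "19110 1726882563.12385: exiting _queue_task() for managed_node1/include_tasks")
entries = parse_entries(sample)
print(entries[1][1] - entries[0][1])  # time elapsed between the two entries
```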
19110 1726882563.12681: running TaskExecutor() for managed_node1/TASK: Include the task 'assert_output_in_stderr_without_warnings.yml' 19110 1726882563.12775: in run() - task 0e448fcc-3ce9-5372-c19a-000000000030 19110 1726882563.12791: variable 'ansible_search_path' from source: unknown 19110 1726882563.12834: calling self._execute() 19110 1726882563.12936: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882563.12940: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882563.12953: variable 'omit' from source: magic vars 19110 1726882563.13334: variable 'ansible_distribution_major_version' from source: facts 19110 1726882563.13346: Evaluated conditional (ansible_distribution_major_version != '6'): True 19110 1726882563.13353: _execute() done 19110 1726882563.13358: dumping result to json 19110 1726882563.13361: done dumping result, returning 19110 1726882563.13367: done running TaskExecutor() for managed_node1/TASK: Include the task 'assert_output_in_stderr_without_warnings.yml' [0e448fcc-3ce9-5372-c19a-000000000030] 19110 1726882563.13375: sending task result for task 0e448fcc-3ce9-5372-c19a-000000000030 19110 1726882563.13485: done sending task result for task 0e448fcc-3ce9-5372-c19a-000000000030 19110 1726882563.13489: WORKER PROCESS EXITING 19110 1726882563.13519: no more pending results, returning what we have 19110 1726882563.13525: in VariableManager get_vars() 19110 1726882563.13574: Calling all_inventory to load vars for managed_node1 19110 1726882563.13577: Calling groups_inventory to load vars for managed_node1 19110 1726882563.13580: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882563.13594: Calling all_plugins_play to load vars for managed_node1 19110 1726882563.13597: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882563.13600: Calling groups_plugins_play to load vars for managed_node1 19110 1726882563.15226: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882563.17009: done with get_vars() 19110 1726882563.17033: variable 'ansible_search_path' from source: unknown 19110 1726882563.17047: we have included files to process 19110 1726882563.17048: generating all_blocks data 19110 1726882563.17050: done generating all_blocks data 19110 1726882563.17056: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_output_in_stderr_without_warnings.yml 19110 1726882563.17057: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_output_in_stderr_without_warnings.yml 19110 1726882563.17059: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_output_in_stderr_without_warnings.yml 19110 1726882563.17381: done processing included file 19110 1726882563.17384: iterating over new_blocks loaded from include file 19110 1726882563.17385: in VariableManager get_vars() 19110 1726882563.17399: done with get_vars() 19110 1726882563.17401: filtering new block on tags 19110 1726882563.17418: done filtering new block on tags 19110 1726882563.17420: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_output_in_stderr_without_warnings.yml for managed_node1 19110 1726882563.17424: extending task lists for all hosts with included blocks 19110 1726882563.17453: done extending task lists 19110 1726882563.17454: done processing included files 19110 1726882563.17455: results queue empty 19110 1726882563.17456: checking for any_errors_fatal 19110 1726882563.17462: done checking for any_errors_fatal 19110 1726882563.17463: checking for max_fail_percentage 19110 1726882563.17464: done checking for 
max_fail_percentage 19110 1726882563.17465: checking to see if all hosts have failed and the running result is not ok 19110 1726882563.17466: done checking to see if all hosts have failed 19110 1726882563.17466: getting the remaining hosts for this loop 19110 1726882563.17468: done getting the remaining hosts for this loop 19110 1726882563.17471: getting the next task for host managed_node1 19110 1726882563.17475: done getting next task for host managed_node1 19110 1726882563.17478: ^ task is: TASK: Assert that warnings is empty 19110 1726882563.17480: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19110 1726882563.17482: getting variables 19110 1726882563.17483: in VariableManager get_vars() 19110 1726882563.17493: Calling all_inventory to load vars for managed_node1 19110 1726882563.17495: Calling groups_inventory to load vars for managed_node1 19110 1726882563.17498: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882563.17503: Calling all_plugins_play to load vars for managed_node1 19110 1726882563.17505: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882563.17507: Calling groups_plugins_play to load vars for managed_node1 19110 1726882563.18873: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882563.21732: done with get_vars() 19110 1726882563.21754: done getting variables 19110 1726882563.21798: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Assert that warnings is empty] ******************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_output_in_stderr_without_warnings.yml:3 Friday 20 September 2024 21:36:03 -0400 (0:00:00.098) 0:00:20.075 ****** 19110 1726882563.21832: entering _queue_task() for managed_node1/assert 19110 1726882563.22192: worker is 1 (out of 1 available) 19110 1726882563.22204: exiting _queue_task() for managed_node1/assert 19110 1726882563.22215: done queuing things up, now waiting for results queue to drain 19110 1726882563.22216: waiting for pending results... 
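The "Assert that warnings is empty" task above evaluates the conditional `('warnings' not in __network_connections_result)` to True and reports "All assertions passed". Reduced to plain Python, the check amounts to the following (a sketch with a made-up result dict; `__network_connections_result` is the registered role output):

```python
def assert_no_warnings(result):
    # Sketch of the membership test behind the assert task: pass when the
    # registered result carries no 'warnings' key at all.
    if "warnings" in result:
        raise AssertionError(f"unexpected warnings: {result['warnings']}")
    return {"changed": False, "msg": "All assertions passed"}

# Hypothetical registered result with no warnings, as in the run above.
clean = {"changed": True, "stderr": "(sample stderr output)"}
print(assert_no_warnings(clean)["msg"])
```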
19110 1726882563.22515: running TaskExecutor() for managed_node1/TASK: Assert that warnings is empty 19110 1726882563.22622: in run() - task 0e448fcc-3ce9-5372-c19a-000000000304 19110 1726882563.22635: variable 'ansible_search_path' from source: unknown 19110 1726882563.22639: variable 'ansible_search_path' from source: unknown 19110 1726882563.22680: calling self._execute() 19110 1726882563.22775: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882563.22782: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882563.22793: variable 'omit' from source: magic vars 19110 1726882563.23186: variable 'ansible_distribution_major_version' from source: facts 19110 1726882563.23198: Evaluated conditional (ansible_distribution_major_version != '6'): True 19110 1726882563.23208: variable 'omit' from source: magic vars 19110 1726882563.23251: variable 'omit' from source: magic vars 19110 1726882563.23287: variable 'omit' from source: magic vars 19110 1726882563.23330: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19110 1726882563.23378: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19110 1726882563.23398: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19110 1726882563.23414: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19110 1726882563.23430: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19110 1726882563.23466: variable 'inventory_hostname' from source: host vars for 'managed_node1' 19110 1726882563.23470: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882563.23472: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 
1726882563.23575: Set connection var ansible_timeout to 10 19110 1726882563.23589: Set connection var ansible_module_compression to ZIP_DEFLATED 19110 1726882563.23592: Set connection var ansible_shell_executable to /bin/sh 19110 1726882563.23594: Set connection var ansible_shell_type to sh 19110 1726882563.23597: Set connection var ansible_connection to ssh 19110 1726882563.23605: Set connection var ansible_pipelining to False 19110 1726882563.23623: variable 'ansible_shell_executable' from source: unknown 19110 1726882563.23627: variable 'ansible_connection' from source: unknown 19110 1726882563.23630: variable 'ansible_module_compression' from source: unknown 19110 1726882563.23632: variable 'ansible_shell_type' from source: unknown 19110 1726882563.23634: variable 'ansible_shell_executable' from source: unknown 19110 1726882563.23637: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882563.23646: variable 'ansible_pipelining' from source: unknown 19110 1726882563.23649: variable 'ansible_timeout' from source: unknown 19110 1726882563.23653: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882563.23796: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 19110 1726882563.23806: variable 'omit' from source: magic vars 19110 1726882563.23812: starting attempt loop 19110 1726882563.23815: running the handler 19110 1726882563.23948: variable '__network_connections_result' from source: set_fact 19110 1726882563.23961: Evaluated conditional ('warnings' not in __network_connections_result): True 19110 1726882563.23971: handler run complete 19110 1726882563.23987: attempt loop complete, returning result 19110 1726882563.23993: _execute() done 19110 
1726882563.23996: dumping result to json 19110 1726882563.23998: done dumping result, returning 19110 1726882563.24005: done running TaskExecutor() for managed_node1/TASK: Assert that warnings is empty [0e448fcc-3ce9-5372-c19a-000000000304] 19110 1726882563.24011: sending task result for task 0e448fcc-3ce9-5372-c19a-000000000304 19110 1726882563.24102: done sending task result for task 0e448fcc-3ce9-5372-c19a-000000000304 19110 1726882563.24106: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false } MSG: All assertions passed 19110 1726882563.24154: no more pending results, returning what we have 19110 1726882563.24158: results queue empty 19110 1726882563.24159: checking for any_errors_fatal 19110 1726882563.24160: done checking for any_errors_fatal 19110 1726882563.24161: checking for max_fail_percentage 19110 1726882563.24164: done checking for max_fail_percentage 19110 1726882563.24167: checking to see if all hosts have failed and the running result is not ok 19110 1726882563.24167: done checking to see if all hosts have failed 19110 1726882563.24168: getting the remaining hosts for this loop 19110 1726882563.24170: done getting the remaining hosts for this loop 19110 1726882563.24173: getting the next task for host managed_node1 19110 1726882563.24180: done getting next task for host managed_node1 19110 1726882563.24182: ^ task is: TASK: Assert that there is output in stderr 19110 1726882563.24186: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 19110 1726882563.24189: getting variables 19110 1726882563.24192: in VariableManager get_vars() 19110 1726882563.24231: Calling all_inventory to load vars for managed_node1 19110 1726882563.24234: Calling groups_inventory to load vars for managed_node1 19110 1726882563.24236: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882563.24247: Calling all_plugins_play to load vars for managed_node1 19110 1726882563.24250: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882563.24255: Calling groups_plugins_play to load vars for managed_node1 19110 1726882563.25918: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882563.27828: done with get_vars() 19110 1726882563.27849: done getting variables 19110 1726882563.27902: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Assert that there is output in stderr] *********************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_output_in_stderr_without_warnings.yml:8 Friday 20 September 2024 21:36:03 -0400 (0:00:00.060) 0:00:20.136 ****** 19110 1726882563.27928: entering _queue_task() for managed_node1/assert 19110 1726882563.28209: worker is 1 (out of 1 available) 19110 1726882563.28220: exiting _queue_task() for managed_node1/assert 19110 1726882563.28231: done queuing things up, now waiting for results queue to drain 19110 1726882563.28233: waiting for pending results... 
19110 1726882563.28500: running TaskExecutor() for managed_node1/TASK: Assert that there is output in stderr 19110 1726882563.28592: in run() - task 0e448fcc-3ce9-5372-c19a-000000000305 19110 1726882563.28606: variable 'ansible_search_path' from source: unknown 19110 1726882563.28611: variable 'ansible_search_path' from source: unknown 19110 1726882563.28643: calling self._execute() 19110 1726882563.28736: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882563.28740: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882563.28750: variable 'omit' from source: magic vars 19110 1726882563.29146: variable 'ansible_distribution_major_version' from source: facts 19110 1726882563.29161: Evaluated conditional (ansible_distribution_major_version != '6'): True 19110 1726882563.29164: variable 'omit' from source: magic vars 19110 1726882563.29200: variable 'omit' from source: magic vars 19110 1726882563.29241: variable 'omit' from source: magic vars 19110 1726882563.29281: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19110 1726882563.29313: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19110 1726882563.29342: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19110 1726882563.29361: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19110 1726882563.29374: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19110 1726882563.29408: variable 'inventory_hostname' from source: host vars for 'managed_node1' 19110 1726882563.29412: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882563.29415: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 
1726882563.29518: Set connection var ansible_timeout to 10 19110 1726882563.29529: Set connection var ansible_module_compression to ZIP_DEFLATED 19110 1726882563.29535: Set connection var ansible_shell_executable to /bin/sh 19110 1726882563.29538: Set connection var ansible_shell_type to sh 19110 1726882563.29540: Set connection var ansible_connection to ssh 19110 1726882563.29549: Set connection var ansible_pipelining to False 19110 1726882563.29575: variable 'ansible_shell_executable' from source: unknown 19110 1726882563.29579: variable 'ansible_connection' from source: unknown 19110 1726882563.29582: variable 'ansible_module_compression' from source: unknown 19110 1726882563.29585: variable 'ansible_shell_type' from source: unknown 19110 1726882563.29587: variable 'ansible_shell_executable' from source: unknown 19110 1726882563.29590: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882563.29592: variable 'ansible_pipelining' from source: unknown 19110 1726882563.29594: variable 'ansible_timeout' from source: unknown 19110 1726882563.29596: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882563.29740: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 19110 1726882563.29750: variable 'omit' from source: magic vars 19110 1726882563.29759: starting attempt loop 19110 1726882563.29762: running the handler 19110 1726882563.29901: variable '__network_connections_result' from source: set_fact 19110 1726882563.29921: Evaluated conditional ('stderr' in __network_connections_result): True 19110 1726882563.29927: handler run complete 19110 1726882563.29949: attempt loop complete, returning result 19110 1726882563.29952: _execute() done 19110 
1726882563.29958: dumping result to json 19110 1726882563.29961: done dumping result, returning 19110 1726882563.29965: done running TaskExecutor() for managed_node1/TASK: Assert that there is output in stderr [0e448fcc-3ce9-5372-c19a-000000000305] 19110 1726882563.29970: sending task result for task 0e448fcc-3ce9-5372-c19a-000000000305 19110 1726882563.30061: done sending task result for task 0e448fcc-3ce9-5372-c19a-000000000305 19110 1726882563.30065: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false } MSG: All assertions passed 19110 1726882563.30137: no more pending results, returning what we have 19110 1726882563.30141: results queue empty 19110 1726882563.30142: checking for any_errors_fatal 19110 1726882563.30149: done checking for any_errors_fatal 19110 1726882563.30150: checking for max_fail_percentage 19110 1726882563.30152: done checking for max_fail_percentage 19110 1726882563.30152: checking to see if all hosts have failed and the running result is not ok 19110 1726882563.30153: done checking to see if all hosts have failed 19110 1726882563.30154: getting the remaining hosts for this loop 19110 1726882563.30157: done getting the remaining hosts for this loop 19110 1726882563.30160: getting the next task for host managed_node1 19110 1726882563.30170: done getting next task for host managed_node1 19110 1726882563.30172: ^ task is: TASK: meta (flush_handlers) 19110 1726882563.30174: ^ state is: HOST STATE: block=5, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19110 1726882563.30179: getting variables 19110 1726882563.30181: in VariableManager get_vars() 19110 1726882563.30218: Calling all_inventory to load vars for managed_node1 19110 1726882563.30221: Calling groups_inventory to load vars for managed_node1 19110 1726882563.30223: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882563.30234: Calling all_plugins_play to load vars for managed_node1 19110 1726882563.30237: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882563.30241: Calling groups_plugins_play to load vars for managed_node1 19110 1726882563.32921: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882563.35294: done with get_vars() 19110 1726882563.35324: done getting variables 19110 1726882563.35396: in VariableManager get_vars() 19110 1726882563.35410: Calling all_inventory to load vars for managed_node1 19110 1726882563.35413: Calling groups_inventory to load vars for managed_node1 19110 1726882563.35420: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882563.35426: Calling all_plugins_play to load vars for managed_node1 19110 1726882563.35429: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882563.35431: Calling groups_plugins_play to load vars for managed_node1 19110 1726882563.36951: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882563.39136: done with get_vars() 19110 1726882563.39172: done queuing things up, now waiting for results queue to drain 19110 1726882563.39175: results queue empty 19110 1726882563.39176: checking for any_errors_fatal 19110 1726882563.39179: done checking for any_errors_fatal 19110 1726882563.39180: checking for max_fail_percentage 19110 1726882563.39181: done checking for max_fail_percentage 19110 1726882563.39182: checking to see if all hosts have failed and the running result is not 
ok 19110 1726882563.39187: done checking to see if all hosts have failed 19110 1726882563.39188: getting the remaining hosts for this loop 19110 1726882563.39194: done getting the remaining hosts for this loop 19110 1726882563.39197: getting the next task for host managed_node1 19110 1726882563.39201: done getting next task for host managed_node1 19110 1726882563.39203: ^ task is: TASK: meta (flush_handlers) 19110 1726882563.39205: ^ state is: HOST STATE: block=6, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19110 1726882563.39212: getting variables 19110 1726882563.39213: in VariableManager get_vars() 19110 1726882563.39225: Calling all_inventory to load vars for managed_node1 19110 1726882563.39227: Calling groups_inventory to load vars for managed_node1 19110 1726882563.39230: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882563.39235: Calling all_plugins_play to load vars for managed_node1 19110 1726882563.39237: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882563.39240: Calling groups_plugins_play to load vars for managed_node1 19110 1726882563.40698: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882563.42615: done with get_vars() 19110 1726882563.42634: done getting variables 19110 1726882563.42703: in VariableManager get_vars() 19110 1726882563.42715: Calling all_inventory to load vars for managed_node1 19110 1726882563.42717: Calling groups_inventory to load vars for managed_node1 19110 1726882563.42719: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882563.42726: Calling all_plugins_play to load vars for managed_node1 19110 1726882563.42729: Calling groups_plugins_inventory to load vars for 
managed_node1 19110 1726882563.42731: Calling groups_plugins_play to load vars for managed_node1 19110 1726882563.45261: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882563.47108: done with get_vars() 19110 1726882563.47137: done queuing things up, now waiting for results queue to drain 19110 1726882563.47145: results queue empty 19110 1726882563.47146: checking for any_errors_fatal 19110 1726882563.47147: done checking for any_errors_fatal 19110 1726882563.47148: checking for max_fail_percentage 19110 1726882563.47149: done checking for max_fail_percentage 19110 1726882563.47150: checking to see if all hosts have failed and the running result is not ok 19110 1726882563.47151: done checking to see if all hosts have failed 19110 1726882563.47152: getting the remaining hosts for this loop 19110 1726882563.47153: done getting the remaining hosts for this loop 19110 1726882563.47158: getting the next task for host managed_node1 19110 1726882563.47162: done getting next task for host managed_node1 19110 1726882563.47162: ^ task is: None 19110 1726882563.47166: ^ state is: HOST STATE: block=7, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19110 1726882563.47167: done queuing things up, now waiting for results queue to drain 19110 1726882563.47168: results queue empty 19110 1726882563.47169: checking for any_errors_fatal 19110 1726882563.47170: done checking for any_errors_fatal 19110 1726882563.47170: checking for max_fail_percentage 19110 1726882563.47171: done checking for max_fail_percentage 19110 1726882563.47172: checking to see if all hosts have failed and the running result is not ok 19110 1726882563.47173: done checking to see if all hosts have failed 19110 1726882563.47174: getting the next task for host managed_node1 19110 1726882563.47176: done getting next task for host managed_node1 19110 1726882563.47177: ^ task is: None 19110 1726882563.47179: ^ state is: HOST STATE: block=7, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19110 1726882563.47230: in VariableManager get_vars() 19110 1726882563.47246: done with get_vars() 19110 1726882563.47260: in VariableManager get_vars() 19110 1726882563.47272: done with get_vars() 19110 1726882563.47277: variable 'omit' from source: magic vars 19110 1726882563.47313: in VariableManager get_vars() 19110 1726882563.47324: done with get_vars() 19110 1726882563.47346: variable 'omit' from source: magic vars PLAY [Play for cleaning up the test device and the connection profile] ********* 19110 1726882563.47623: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 19110 1726882563.47653: getting the remaining hosts for this loop 19110 1726882563.47657: done getting the remaining hosts for this loop 19110 1726882563.47660: getting the next task for host managed_node1 19110 1726882563.47662: done getting next task for host managed_node1 19110 1726882563.47667: ^ task is: TASK: Gathering Facts 19110 1726882563.47668: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19110 1726882563.47671: getting variables 19110 1726882563.47672: in VariableManager get_vars() 19110 1726882563.47680: Calling all_inventory to load vars for managed_node1 19110 1726882563.47682: Calling groups_inventory to load vars for managed_node1 19110 1726882563.47686: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882563.47691: Calling all_plugins_play to load vars for managed_node1 19110 1726882563.47693: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882563.47696: Calling groups_plugins_play to load vars for managed_node1 19110 1726882563.49341: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882563.51326: done with get_vars() 19110 1726882563.51348: done getting variables 19110 1726882563.51407: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:50 Friday 20 September 2024 21:36:03 -0400 (0:00:00.235) 0:00:20.371 ****** 19110 1726882563.51433: entering _queue_task() for managed_node1/gather_facts 19110 1726882563.51797: worker is 1 (out of 1 available) 19110 1726882563.51814: exiting _queue_task() for managed_node1/gather_facts 19110 1726882563.51828: done queuing things up, now waiting for results queue to drain 19110 1726882563.51830: waiting for pending results... 
19110 1726882563.52124: running TaskExecutor() for managed_node1/TASK: Gathering Facts 19110 1726882563.52233: in run() - task 0e448fcc-3ce9-5372-c19a-000000000316 19110 1726882563.52269: variable 'ansible_search_path' from source: unknown 19110 1726882563.52312: calling self._execute() 19110 1726882563.52417: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882563.52428: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882563.52440: variable 'omit' from source: magic vars 19110 1726882563.53016: variable 'ansible_distribution_major_version' from source: facts 19110 1726882563.53047: Evaluated conditional (ansible_distribution_major_version != '6'): True 19110 1726882563.53109: variable 'omit' from source: magic vars 19110 1726882563.53153: variable 'omit' from source: magic vars 19110 1726882563.53323: variable 'omit' from source: magic vars 19110 1726882563.53489: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19110 1726882563.53667: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19110 1726882563.53860: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19110 1726882563.53910: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19110 1726882563.53979: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19110 1726882563.54046: variable 'inventory_hostname' from source: host vars for 'managed_node1' 19110 1726882563.54057: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882563.54068: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882563.54174: Set connection var ansible_timeout to 10 19110 1726882563.54190: Set connection 
var ansible_module_compression to ZIP_DEFLATED 19110 1726882563.54198: Set connection var ansible_shell_executable to /bin/sh 19110 1726882563.54203: Set connection var ansible_shell_type to sh 19110 1726882563.54208: Set connection var ansible_connection to ssh 19110 1726882563.54218: Set connection var ansible_pipelining to False 19110 1726882563.54251: variable 'ansible_shell_executable' from source: unknown 19110 1726882563.54270: variable 'ansible_connection' from source: unknown 19110 1726882563.54278: variable 'ansible_module_compression' from source: unknown 19110 1726882563.54284: variable 'ansible_shell_type' from source: unknown 19110 1726882563.54289: variable 'ansible_shell_executable' from source: unknown 19110 1726882563.54293: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882563.54298: variable 'ansible_pipelining' from source: unknown 19110 1726882563.54303: variable 'ansible_timeout' from source: unknown 19110 1726882563.54308: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882563.54599: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 19110 1726882563.54616: variable 'omit' from source: magic vars 19110 1726882563.54626: starting attempt loop 19110 1726882563.54632: running the handler 19110 1726882563.54652: variable 'ansible_facts' from source: unknown 19110 1726882563.54687: _low_level_execute_command(): starting 19110 1726882563.54703: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 19110 1726882563.56028: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882563.56048: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 
1726882563.56071: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882563.56092: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882563.56135: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882563.56157: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882563.56175: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882563.56195: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882563.56208: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882563.56220: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882563.56232: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882563.56248: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882563.56275: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882563.56289: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882563.56301: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882563.56316: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882563.56476: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882563.56526: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882563.56552: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882563.56707: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 
1726882563.58362: stdout chunk (state=3): >>>/root <<< 19110 1726882563.58595: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882563.58600: stdout chunk (state=3): >>><<< 19110 1726882563.58602: stderr chunk (state=3): >>><<< 19110 1726882563.58746: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 1726882563.58750: _low_level_execute_command(): starting 19110 1726882563.58752: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882563.5862393-20022-38122069629103 `" && echo ansible-tmp-1726882563.5862393-20022-38122069629103="` echo /root/.ansible/tmp/ansible-tmp-1726882563.5862393-20022-38122069629103 `" ) && sleep 0' 19110 1726882563.59399: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 
3.2.2 4 Jun 2024 <<< 19110 1726882563.59422: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882563.59438: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882563.59461: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882563.59507: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882563.59520: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882563.59539: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882563.59561: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882563.59576: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882563.59588: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882563.59601: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882563.59625: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882563.59648: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882563.59670: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882563.59682: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882563.59697: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882563.59778: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882563.59800: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882563.59817: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 
<<< 19110 1726882563.59941: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882563.61791: stdout chunk (state=3): >>>ansible-tmp-1726882563.5862393-20022-38122069629103=/root/.ansible/tmp/ansible-tmp-1726882563.5862393-20022-38122069629103 <<< 19110 1726882563.61902: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882563.61972: stderr chunk (state=3): >>><<< 19110 1726882563.61975: stdout chunk (state=3): >>><<< 19110 1726882563.62397: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882563.5862393-20022-38122069629103=/root/.ansible/tmp/ansible-tmp-1726882563.5862393-20022-38122069629103 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 1726882563.62400: variable 'ansible_module_compression' from source: unknown 19110 1726882563.62402: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-191108pnkimox/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 19110 1726882563.62405: variable 'ansible_facts' from source: unknown 19110 1726882563.62408: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882563.5862393-20022-38122069629103/AnsiballZ_setup.py 19110 1726882563.62487: Sending initial data 19110 1726882563.62490: Sent initial data (153 bytes) 19110 1726882563.63430: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882563.63443: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882563.63457: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882563.63476: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882563.63514: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882563.63525: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882563.63537: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882563.63552: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882563.63571: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882563.63581: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882563.63591: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882563.63602: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882563.63616: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882563.63625: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 
1726882563.63634: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882563.63645: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882563.63722: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882563.63738: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882563.63751: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882563.63874: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882563.65620: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 19110 1726882563.65729: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 19110 1726882563.65825: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-191108pnkimox/tmp_70hgl41 /root/.ansible/tmp/ansible-tmp-1726882563.5862393-20022-38122069629103/AnsiballZ_setup.py <<< 19110 1726882563.65917: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 19110 1726882563.69973: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882563.70106: stderr chunk (state=3): >>><<< 19110 1726882563.70109: stdout chunk (state=3): >>><<< 19110 1726882563.70112: done transferring module 
to remote 19110 1726882563.70114: _low_level_execute_command(): starting 19110 1726882563.70120: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882563.5862393-20022-38122069629103/ /root/.ansible/tmp/ansible-tmp-1726882563.5862393-20022-38122069629103/AnsiballZ_setup.py && sleep 0' 19110 1726882563.71778: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882563.71790: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882563.71802: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882563.71816: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882563.71860: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882563.71877: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882563.71892: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882563.71908: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882563.71920: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882563.71939: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882563.71973: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882563.71979: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882563.71994: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882563.72037: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882563.72051: stderr chunk (state=3): >>>debug2: match found <<< 19110 
1726882563.72068: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882563.72215: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882563.72282: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882563.72298: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882563.72687: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882563.74536: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882563.74540: stdout chunk (state=3): >>><<< 19110 1726882563.74542: stderr chunk (state=3): >>><<< 19110 1726882563.74630: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 1726882563.74634: 
_low_level_execute_command(): starting 19110 1726882563.74637: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882563.5862393-20022-38122069629103/AnsiballZ_setup.py && sleep 0' 19110 1726882563.75335: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882563.75363: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882563.75382: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882563.75402: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882563.75451: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882563.75485: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882563.75509: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882563.75533: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882563.75558: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882563.75573: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882563.75590: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882563.75613: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882563.75634: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882563.75647: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882563.75669: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882563.75697: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882563.75784: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882563.75805: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882563.75826: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882563.75979: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882564.29078: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system": "Linux", "ansible_kernel": "5.14.0-508.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 15:49:37 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "ip-10-31-44-90.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-44-90", "ansible_nodename": "ip-10-31-44-90.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "465293f2bd9b457497a5eaf565f184f8", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_ssh_host_key_dsa_public": 
"AAAAB3NzaC1kc3MAAACBANd7RrvRqB/kTBmp4g3vOXSd1XQX7zSKmnuTmpsZ60vmB0PwXJIj7HTI9DBqGrzofXOOHlh/Qs4ml+D2H79EO+6EUEZza9meZ+anJVVwXBuxgfn1Hl/EExsmp7gV5o+r0rCFeuFds6ISUZQpal2OlQk3lrit00RA//JoxvQO1YQdAAAAFQDxLJGSeERY5sZYyGr88om1BNq9KQAAAIEA1WO5ElHMof6Upf9GQZn4wlJh6JUOjZfYnL4XATh/W6uye7kuC/rBqGPirkmks1GCUkKhSkzQlRIPyLYENrbPKRMNarYAwwQ8N8BDOUWFDCcrO55SJdlLTyYGWdlVFysYGSMIyZT5ye4oL3Nkff/e1ZGqjvL0sLNJaye4za70Xj4AAACAXRnpJPDKgCzakgHO14HsH3r7qS4cvfQld+ThBJZcbtKtJUyUo1sqg9NbvK+hvA41TYWOVy52DcWn5yzPwfhAn+mQEcAdBXWqSSJdESZ9fPnbc9C1XO02sYpyX5+wsZnau23XhtnlnY8jRTpWgRt4X8AWaSND9mfeL8ZDTgVphRc=", "ansible_ssh_host_key_dsa_public_ke<<< 19110 1726882564.29090: stdout chunk (state=3): >>>ytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCi3knQjBMOMykE1WC1CWkOsV5q0hmL0iOCSTq050rqbxjUmkXoe4BeOWZh4ID8k5GgD5emRh5PU5ME25rsP6hnelUBzDOCjdgI1rmTkUbw5oFRN+kpX2RbAM/2L5J6RrqqllFBjrHtSG6JINsPIWqLn9rlmE965Kj4AY3QplgiApeN07yDM5YPS2tpRpVHVIxZia7pdzKs+h+TXJoo/Z7SMAw8MTUNeIXd9bSzuxhulOrLNrvrcej4EVg88FYiy96oA+NBNLFb41RHNLumM3xUQvjCeyqP1hcUw9YAY+gwADhRGBKQ+JCAzzLqyM/3RAO8gXXJr1Yjr+H9xi8IwKB71QU/lw7bWc33YuNbe5mDlUHQ/a2qvo4O96wD8m4eZpu81iHiwtIU5cwKm+fk8sz9kxOR77AozaYLtjgW9FYUmxh2ChfTBV2rnFflhC3CjgRMlZv8CLtne5JcsRFSpHeCB2RXzA1JPiF89OxoUFa8NsPqdUyAMUkaR8MmW+fj+t8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAqf0PBdBRTqmYqGYFABEC2soCa/Gsi3A2munUEZEo0enWjwRQivB5pKJDjOn6lwgGyr2ebU0/VpRzddk73uEfk=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOdij61fBkKDElS700Z560nYW2c4QCIx/VplUW7jn+UE", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:ef4e1c39-6f50-438a-87e7-12fb70b80bde", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": 
"targeted"}, "ansible_loadavg": {"1m": 0.47, "5m": 0.4, "15m": 0.22}, "ansible_fips": false, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_fibre_channel_wwn": [], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "36", "second": "04", "epoch": "1726882564", "epoch_int": "1726882564", "date": "2024-09-20", "time": "21:36:04", "iso8601_micro": "2024-09-21T01:36:04.009422Z", "iso8601": "2024-09-21T01:36:04Z", "iso8601_basic": "20240920T213604009422", "iso8601_basic_short": "20240920T213604", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_iscsi_iqn": "", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2814, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 718, "free": 2814}, "nocache": {"free": 3276, "used": 256}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": 
"NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2ca037-5a7b-1ebc-28c4-7fb32adce23a", "ansible_product_uuid": "ec2ca037-5a7b-1ebc-28c4-7fb32adce23a", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions":<<< 19110 1726882564.29127: stdout chunk (state=3): >>> {"xvda1": {"links": {"ids": [], "uuids": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 722, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264239267840, "block_size": 4096, "block_total": 65519355, "block_available": 64511540, "block_used": 1007815, "inode_total": 131071472, "inode_available": 130998698, "inode_used": 72774, "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013"}], "ansible_env": {"SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": 
"/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.40.7 34614 10.31.44.90 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.40.7 34614 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_local": {}, "ansible_is_chroot": false, "ansible_interfaces": ["peerlsr27", "lsr27", "eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "02:9e:a1:0b:f9:6d", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.44.90", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::9e:a1ff:fe0b:f96d", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": 
"off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [<<< 19110 1726882564.29133: stdout chunk (state=3): >>>fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lsr27": {"device": "lsr27", "macaddress": "a6:f2:4e:57:42:4a", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv4": {"address": "192.0.2.1", "broadcast": "192.0.2.255", "netmask": "255.255.255.0", "network": "192.0.2.0", "prefix": 
"24"}, "ipv6": [{"address": "fe80::8471:8085:75a2:367", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", 
"macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_peerlsr27": {"device": "peerlsr27", "macaddress": "3a:77:3d:04:80:fa", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::3877:3dff:fe04:80fa", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_off<<< 19110 1726882564.29167: stdout chunk (state=3): >>>load": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": 
"off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", 
"tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_ti<<< 19110 1726882564.29180: stdout chunk (state=3): >>>mestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.44.90", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:9e:a1:0b:f9:6d", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.44.90", "192.0.2.1"], "ansible_all_ipv6_addresses": ["fe80::9e:a1ff:fe0b:f96d", "fe80::8471:8085:75a2:367", "fe80::3877:3dff:fe04:80fa"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.44.90", "127.0.0.0/8", "127.0.0.1", "192.0.2.1"], "ipv6": ["::1", "fe80::9e:a1ff:fe0b:f96d", 
"fe80::3877:3dff:fe04:80fa", "fe80::8471:8085:75a2:367"]}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_apparmor": {"status": "disabled"}, "ansible_lsb": {}, "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 19110 1726882564.30770: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. <<< 19110 1726882564.30825: stderr chunk (state=3): >>><<< 19110 1726882564.30828: stdout chunk (state=3): >>><<< 19110 1726882564.30872: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system": "Linux", "ansible_kernel": "5.14.0-508.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 15:49:37 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "ip-10-31-44-90.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-44-90", "ansible_nodename": "ip-10-31-44-90.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "465293f2bd9b457497a5eaf565f184f8", "ansible_virtualization_type": "xen", "ansible_virtualization_role": 
"guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBANd7RrvRqB/kTBmp4g3vOXSd1XQX7zSKmnuTmpsZ60vmB0PwXJIj7HTI9DBqGrzofXOOHlh/Qs4ml+D2H79EO+6EUEZza9meZ+anJVVwXBuxgfn1Hl/EExsmp7gV5o+r0rCFeuFds6ISUZQpal2OlQk3lrit00RA//JoxvQO1YQdAAAAFQDxLJGSeERY5sZYyGr88om1BNq9KQAAAIEA1WO5ElHMof6Upf9GQZn4wlJh6JUOjZfYnL4XATh/W6uye7kuC/rBqGPirkmks1GCUkKhSkzQlRIPyLYENrbPKRMNarYAwwQ8N8BDOUWFDCcrO55SJdlLTyYGWdlVFysYGSMIyZT5ye4oL3Nkff/e1ZGqjvL0sLNJaye4za70Xj4AAACAXRnpJPDKgCzakgHO14HsH3r7qS4cvfQld+ThBJZcbtKtJUyUo1sqg9NbvK+hvA41TYWOVy52DcWn5yzPwfhAn+mQEcAdBXWqSSJdESZ9fPnbc9C1XO02sYpyX5+wsZnau23XhtnlnY8jRTpWgRt4X8AWaSND9mfeL8ZDTgVphRc=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCi3knQjBMOMykE1WC1CWkOsV5q0hmL0iOCSTq050rqbxjUmkXoe4BeOWZh4ID8k5GgD5emRh5PU5ME25rsP6hnelUBzDOCjdgI1rmTkUbw5oFRN+kpX2RbAM/2L5J6RrqqllFBjrHtSG6JINsPIWqLn9rlmE965Kj4AY3QplgiApeN07yDM5YPS2tpRpVHVIxZia7pdzKs+h+TXJoo/Z7SMAw8MTUNeIXd9bSzuxhulOrLNrvrcej4EVg88FYiy96oA+NBNLFb41RHNLumM3xUQvjCeyqP1hcUw9YAY+gwADhRGBKQ+JCAzzLqyM/3RAO8gXXJr1Yjr+H9xi8IwKB71QU/lw7bWc33YuNbe5mDlUHQ/a2qvo4O96wD8m4eZpu81iHiwtIU5cwKm+fk8sz9kxOR77AozaYLtjgW9FYUmxh2ChfTBV2rnFflhC3CjgRMlZv8CLtne5JcsRFSpHeCB2RXzA1JPiF89OxoUFa8NsPqdUyAMUkaR8MmW+fj+t8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAqf0PBdBRTqmYqGYFABEC2soCa/Gsi3A2munUEZEo0enWjwRQivB5pKJDjOn6lwgGyr2ebU0/VpRzddk73uEfk=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", 
"ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOdij61fBkKDElS700Z560nYW2c4QCIx/VplUW7jn+UE", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:ef4e1c39-6f50-438a-87e7-12fb70b80bde", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_loadavg": {"1m": 0.47, "5m": 0.4, "15m": 0.22}, "ansible_fips": false, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_fibre_channel_wwn": [], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "36", "second": "04", "epoch": "1726882564", "epoch_int": "1726882564", "date": "2024-09-20", "time": "21:36:04", "iso8601_micro": "2024-09-21T01:36:04.009422Z", "iso8601": "2024-09-21T01:36:04Z", "iso8601_basic": "20240920T213604009422", "iso8601_basic_short": "20240920T213604", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_iscsi_iqn": "", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, 
"ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2814, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 718, "free": 2814}, "nocache": {"free": 3276, "used": 256}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2ca037-5a7b-1ebc-28c4-7fb32adce23a", "ansible_product_uuid": "ec2ca037-5a7b-1ebc-28c4-7fb32adce23a", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 722, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 
268367278080, "size_available": 264239267840, "block_size": 4096, "block_total": 65519355, "block_available": 64511540, "block_used": 1007815, "inode_total": 131071472, "inode_available": 130998698, "inode_used": 72774, "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013"}], "ansible_env": {"SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.40.7 34614 10.31.44.90 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.40.7 34614 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_local": {}, "ansible_is_chroot": false, "ansible_interfaces": ["peerlsr27", "lsr27", "eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "02:9e:a1:0b:f9:6d", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.44.90", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::9e:a1ff:fe0b:f96d", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", 
"tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, 
"timestamping": [], "hw_timestamp_filters": []}, "ansible_lsr27": {"device": "lsr27", "macaddress": "a6:f2:4e:57:42:4a", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv4": {"address": "192.0.2.1", "broadcast": "192.0.2.255", "netmask": "255.255.255.0", "network": "192.0.2.0", "prefix": "24"}, "ipv6": [{"address": "fe80::8471:8085:75a2:367", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off 
[fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_peerlsr27": {"device": "peerlsr27", "macaddress": "3a:77:3d:04:80:fa", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::3877:3dff:fe04:80fa", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", 
"tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off 
[fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.44.90", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:9e:a1:0b:f9:6d", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.44.90", "192.0.2.1"], "ansible_all_ipv6_addresses": ["fe80::9e:a1ff:fe0b:f96d", 
"fe80::8471:8085:75a2:367", "fe80::3877:3dff:fe04:80fa"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.44.90", "127.0.0.0/8", "127.0.0.1", "192.0.2.1"], "ipv6": ["::1", "fe80::9e:a1ff:fe0b:f96d", "fe80::3877:3dff:fe04:80fa", "fe80::8471:8085:75a2:367"]}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_apparmor": {"status": "disabled"}, "ansible_lsb": {}, "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master 
session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 19110 1726882564.31148: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882563.5862393-20022-38122069629103/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 19110 1726882564.31167: _low_level_execute_command(): starting 19110 1726882564.31170: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882563.5862393-20022-38122069629103/ > /dev/null 2>&1 && sleep 0' 19110 1726882564.31619: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882564.31631: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882564.31655: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882564.31676: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882564.31719: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882564.31731: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882564.31828: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882564.33625: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882564.33688: stderr chunk (state=3): >>><<< 19110 1726882564.33691: stdout chunk (state=3): >>><<< 19110 1726882564.33880: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from 
master 0 19110 1726882564.33884: handler run complete 19110 1726882564.33886: variable 'ansible_facts' from source: unknown 19110 1726882564.33930: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882564.34140: variable 'ansible_facts' from source: unknown 19110 1726882564.34203: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882564.34295: attempt loop complete, returning result 19110 1726882564.34298: _execute() done 19110 1726882564.34300: dumping result to json 19110 1726882564.34323: done dumping result, returning 19110 1726882564.34331: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [0e448fcc-3ce9-5372-c19a-000000000316] 19110 1726882564.34336: sending task result for task 0e448fcc-3ce9-5372-c19a-000000000316 19110 1726882564.34661: done sending task result for task 0e448fcc-3ce9-5372-c19a-000000000316 19110 1726882564.34667: WORKER PROCESS EXITING ok: [managed_node1] 19110 1726882564.34906: no more pending results, returning what we have 19110 1726882564.34909: results queue empty 19110 1726882564.34909: checking for any_errors_fatal 19110 1726882564.34910: done checking for any_errors_fatal 19110 1726882564.34910: checking for max_fail_percentage 19110 1726882564.34912: done checking for max_fail_percentage 19110 1726882564.34912: checking to see if all hosts have failed and the running result is not ok 19110 1726882564.34913: done checking to see if all hosts have failed 19110 1726882564.34913: getting the remaining hosts for this loop 19110 1726882564.34914: done getting the remaining hosts for this loop 19110 1726882564.34917: getting the next task for host managed_node1 19110 1726882564.34922: done getting next task for host managed_node1 19110 1726882564.34923: ^ task is: TASK: meta (flush_handlers) 19110 1726882564.34925: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, 
run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19110 1726882564.34927: getting variables 19110 1726882564.34928: in VariableManager get_vars() 19110 1726882564.34946: Calling all_inventory to load vars for managed_node1 19110 1726882564.34948: Calling groups_inventory to load vars for managed_node1 19110 1726882564.34950: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882564.34957: Calling all_plugins_play to load vars for managed_node1 19110 1726882564.34959: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882564.34961: Calling groups_plugins_play to load vars for managed_node1 19110 1726882564.35826: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882564.36771: done with get_vars() 19110 1726882564.36786: done getting variables 19110 1726882564.36833: in VariableManager get_vars() 19110 1726882564.36839: Calling all_inventory to load vars for managed_node1 19110 1726882564.36841: Calling groups_inventory to load vars for managed_node1 19110 1726882564.36842: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882564.36845: Calling all_plugins_play to load vars for managed_node1 19110 1726882564.36846: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882564.36848: Calling groups_plugins_play to load vars for managed_node1 19110 1726882564.37519: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882564.38464: done with get_vars() 19110 1726882564.38481: done queuing things up, now waiting for results queue to drain 19110 1726882564.38483: results queue empty 19110 1726882564.38484: checking for any_errors_fatal 19110 1726882564.38486: done checking for 
any_errors_fatal 19110 1726882564.38487: checking for max_fail_percentage 19110 1726882564.38488: done checking for max_fail_percentage 19110 1726882564.38488: checking to see if all hosts have failed and the running result is not ok 19110 1726882564.38492: done checking to see if all hosts have failed 19110 1726882564.38492: getting the remaining hosts for this loop 19110 1726882564.38493: done getting the remaining hosts for this loop 19110 1726882564.38495: getting the next task for host managed_node1 19110 1726882564.38498: done getting next task for host managed_node1 19110 1726882564.38500: ^ task is: TASK: Show network_provider 19110 1726882564.38501: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19110 1726882564.38502: getting variables 19110 1726882564.38503: in VariableManager get_vars() 19110 1726882564.38509: Calling all_inventory to load vars for managed_node1 19110 1726882564.38510: Calling groups_inventory to load vars for managed_node1 19110 1726882564.38511: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882564.38514: Calling all_plugins_play to load vars for managed_node1 19110 1726882564.38516: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882564.38517: Calling groups_plugins_play to load vars for managed_node1 19110 1726882564.39213: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882564.40131: done with get_vars() 19110 1726882564.40143: done getting variables 19110 1726882564.40175: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show network_provider] *************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:53 Friday 20 September 2024 21:36:04 -0400 (0:00:00.887) 0:00:21.259 ****** 19110 1726882564.40195: entering _queue_task() for managed_node1/debug 19110 1726882564.40417: worker is 1 (out of 1 available) 19110 1726882564.40429: exiting _queue_task() for managed_node1/debug 19110 1726882564.40439: done queuing things up, now waiting for results queue to drain 19110 1726882564.40440: waiting for pending results... 19110 1726882564.40609: running TaskExecutor() for managed_node1/TASK: Show network_provider 19110 1726882564.40667: in run() - task 0e448fcc-3ce9-5372-c19a-000000000033 19110 1726882564.40679: variable 'ansible_search_path' from source: unknown 19110 1726882564.40706: calling self._execute() 19110 1726882564.40771: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882564.40775: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882564.40784: variable 'omit' from source: magic vars 19110 1726882564.41198: variable 'ansible_distribution_major_version' from source: facts 19110 1726882564.41226: Evaluated conditional (ansible_distribution_major_version != '6'): True 19110 1726882564.41245: variable 'omit' from source: magic vars 19110 1726882564.41297: variable 'omit' from source: magic vars 19110 1726882564.41350: variable 'omit' from source: magic vars 19110 1726882564.41434: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19110 1726882564.41492: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19110 
1726882564.41516: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19110 1726882564.41531: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19110 1726882564.41540: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19110 1726882564.41567: variable 'inventory_hostname' from source: host vars for 'managed_node1' 19110 1726882564.41570: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882564.41573: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882564.41644: Set connection var ansible_timeout to 10 19110 1726882564.41654: Set connection var ansible_module_compression to ZIP_DEFLATED 19110 1726882564.41661: Set connection var ansible_shell_executable to /bin/sh 19110 1726882564.41665: Set connection var ansible_shell_type to sh 19110 1726882564.41667: Set connection var ansible_connection to ssh 19110 1726882564.41672: Set connection var ansible_pipelining to False 19110 1726882564.41690: variable 'ansible_shell_executable' from source: unknown 19110 1726882564.41693: variable 'ansible_connection' from source: unknown 19110 1726882564.41696: variable 'ansible_module_compression' from source: unknown 19110 1726882564.41699: variable 'ansible_shell_type' from source: unknown 19110 1726882564.41701: variable 'ansible_shell_executable' from source: unknown 19110 1726882564.41703: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882564.41707: variable 'ansible_pipelining' from source: unknown 19110 1726882564.41709: variable 'ansible_timeout' from source: unknown 19110 1726882564.41711: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882564.41811: Loading ActionModule 'debug' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 19110 1726882564.41819: variable 'omit' from source: magic vars 19110 1726882564.41824: starting attempt loop 19110 1726882564.41827: running the handler 19110 1726882564.41869: variable 'network_provider' from source: set_fact 19110 1726882564.41922: variable 'network_provider' from source: set_fact 19110 1726882564.41930: handler run complete 19110 1726882564.41946: attempt loop complete, returning result 19110 1726882564.41949: _execute() done 19110 1726882564.41951: dumping result to json 19110 1726882564.41955: done dumping result, returning 19110 1726882564.41963: done running TaskExecutor() for managed_node1/TASK: Show network_provider [0e448fcc-3ce9-5372-c19a-000000000033] 19110 1726882564.41970: sending task result for task 0e448fcc-3ce9-5372-c19a-000000000033 19110 1726882564.42045: done sending task result for task 0e448fcc-3ce9-5372-c19a-000000000033 19110 1726882564.42049: WORKER PROCESS EXITING ok: [managed_node1] => { "network_provider": "nm" } 19110 1726882564.42362: no more pending results, returning what we have 19110 1726882564.42372: results queue empty 19110 1726882564.42373: checking for any_errors_fatal 19110 1726882564.42376: done checking for any_errors_fatal 19110 1726882564.42379: checking for max_fail_percentage 19110 1726882564.42380: done checking for max_fail_percentage 19110 1726882564.42381: checking to see if all hosts have failed and the running result is not ok 19110 1726882564.42382: done checking to see if all hosts have failed 19110 1726882564.42386: getting the remaining hosts for this loop 19110 1726882564.42388: done getting the remaining hosts for this loop 19110 1726882564.42394: getting the next task for host managed_node1 19110 1726882564.42401: done 
getting next task for host managed_node1 19110 1726882564.42403: ^ task is: TASK: meta (flush_handlers) 19110 1726882564.42409: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19110 1726882564.42415: getting variables 19110 1726882564.42417: in VariableManager get_vars() 19110 1726882564.42458: Calling all_inventory to load vars for managed_node1 19110 1726882564.42465: Calling groups_inventory to load vars for managed_node1 19110 1726882564.42470: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882564.42487: Calling all_plugins_play to load vars for managed_node1 19110 1726882564.42491: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882564.42497: Calling groups_plugins_play to load vars for managed_node1 19110 1726882564.43982: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882564.46247: done with get_vars() 19110 1726882564.46267: done getting variables 19110 1726882564.46315: in VariableManager get_vars() 19110 1726882564.46321: Calling all_inventory to load vars for managed_node1 19110 1726882564.46323: Calling groups_inventory to load vars for managed_node1 19110 1726882564.46324: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882564.46327: Calling all_plugins_play to load vars for managed_node1 19110 1726882564.46329: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882564.46330: Calling groups_plugins_play to load vars for managed_node1 19110 1726882564.47199: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882564.49034: done with get_vars() 19110 1726882564.49060: done queuing 
things up, now waiting for results queue to drain 19110 1726882564.49062: results queue empty 19110 1726882564.49063: checking for any_errors_fatal 19110 1726882564.49067: done checking for any_errors_fatal 19110 1726882564.49068: checking for max_fail_percentage 19110 1726882564.49069: done checking for max_fail_percentage 19110 1726882564.49070: checking to see if all hosts have failed and the running result is not ok 19110 1726882564.49071: done checking to see if all hosts have failed 19110 1726882564.49071: getting the remaining hosts for this loop 19110 1726882564.49072: done getting the remaining hosts for this loop 19110 1726882564.49075: getting the next task for host managed_node1 19110 1726882564.49082: done getting next task for host managed_node1 19110 1726882564.49084: ^ task is: TASK: meta (flush_handlers) 19110 1726882564.49085: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19110 1726882564.49088: getting variables 19110 1726882564.49088: in VariableManager get_vars() 19110 1726882564.49096: Calling all_inventory to load vars for managed_node1 19110 1726882564.49098: Calling groups_inventory to load vars for managed_node1 19110 1726882564.49100: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882564.49105: Calling all_plugins_play to load vars for managed_node1 19110 1726882564.49107: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882564.49110: Calling groups_plugins_play to load vars for managed_node1 19110 1726882564.50826: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882564.52574: done with get_vars() 19110 1726882564.52595: done getting variables 19110 1726882564.52645: in VariableManager get_vars() 19110 1726882564.52657: Calling all_inventory to load vars for managed_node1 19110 1726882564.52659: Calling groups_inventory to load vars for managed_node1 19110 1726882564.52662: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882564.52668: Calling all_plugins_play to load vars for managed_node1 19110 1726882564.52671: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882564.52673: Calling groups_plugins_play to load vars for managed_node1 19110 1726882564.53919: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882564.55377: done with get_vars() 19110 1726882564.55394: done queuing things up, now waiting for results queue to drain 19110 1726882564.55395: results queue empty 19110 1726882564.55396: checking for any_errors_fatal 19110 1726882564.55397: done checking for any_errors_fatal 19110 1726882564.55397: checking for max_fail_percentage 19110 1726882564.55398: done checking for max_fail_percentage 19110 1726882564.55399: checking to see if all hosts have failed and the running result is not 
ok 19110 1726882564.55399: done checking to see if all hosts have failed 19110 1726882564.55400: getting the remaining hosts for this loop 19110 1726882564.55400: done getting the remaining hosts for this loop 19110 1726882564.55402: getting the next task for host managed_node1 19110 1726882564.55404: done getting next task for host managed_node1 19110 1726882564.55405: ^ task is: None 19110 1726882564.55406: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19110 1726882564.55407: done queuing things up, now waiting for results queue to drain 19110 1726882564.55407: results queue empty 19110 1726882564.55408: checking for any_errors_fatal 19110 1726882564.55408: done checking for any_errors_fatal 19110 1726882564.55408: checking for max_fail_percentage 19110 1726882564.55409: done checking for max_fail_percentage 19110 1726882564.55409: checking to see if all hosts have failed and the running result is not ok 19110 1726882564.55410: done checking to see if all hosts have failed 19110 1726882564.55411: getting the next task for host managed_node1 19110 1726882564.55412: done getting next task for host managed_node1 19110 1726882564.55413: ^ task is: None 19110 1726882564.55413: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19110 1726882564.55444: in VariableManager get_vars() 19110 1726882564.55462: done with get_vars() 19110 1726882564.55470: in VariableManager get_vars() 19110 1726882564.55479: done with get_vars() 19110 1726882564.55481: variable 'omit' from source: magic vars 19110 1726882564.55571: variable 'profile' from source: play vars 19110 1726882564.55651: in VariableManager get_vars() 19110 1726882564.55667: done with get_vars() 19110 1726882564.55682: variable 'omit' from source: magic vars 19110 1726882564.55724: variable 'profile' from source: play vars PLAY [Set down {{ profile }}] ************************************************** 19110 1726882564.56126: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 19110 1726882564.56145: getting the remaining hosts for this loop 19110 1726882564.56146: done getting the remaining hosts for this loop 19110 1726882564.56148: getting the next task for host managed_node1 19110 1726882564.56150: done getting next task for host managed_node1 19110 1726882564.56151: ^ task is: TASK: Gathering Facts 19110 1726882564.56152: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19110 1726882564.56153: getting variables 19110 1726882564.56154: in VariableManager get_vars() 19110 1726882564.56166: Calling all_inventory to load vars for managed_node1 19110 1726882564.56167: Calling groups_inventory to load vars for managed_node1 19110 1726882564.56169: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882564.56172: Calling all_plugins_play to load vars for managed_node1 19110 1726882564.56174: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882564.56175: Calling groups_plugins_play to load vars for managed_node1 19110 1726882564.57022: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882564.58767: done with get_vars() 19110 1726882564.58787: done getting variables 19110 1726882564.58827: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile.yml:3 Friday 20 September 2024 21:36:04 -0400 (0:00:00.186) 0:00:21.445 ****** 19110 1726882564.58851: entering _queue_task() for managed_node1/gather_facts 19110 1726882564.59168: worker is 1 (out of 1 available) 19110 1726882564.59179: exiting _queue_task() for managed_node1/gather_facts 19110 1726882564.59190: done queuing things up, now waiting for results queue to drain 19110 1726882564.59192: waiting for pending results... 
19110 1726882564.59461: running TaskExecutor() for managed_node1/TASK: Gathering Facts 19110 1726882564.59573: in run() - task 0e448fcc-3ce9-5372-c19a-00000000032b 19110 1726882564.59597: variable 'ansible_search_path' from source: unknown 19110 1726882564.59638: calling self._execute() 19110 1726882564.59728: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882564.59741: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882564.59754: variable 'omit' from source: magic vars 19110 1726882564.60129: variable 'ansible_distribution_major_version' from source: facts 19110 1726882564.60147: Evaluated conditional (ansible_distribution_major_version != '6'): True 19110 1726882564.60162: variable 'omit' from source: magic vars 19110 1726882564.60194: variable 'omit' from source: magic vars 19110 1726882564.60232: variable 'omit' from source: magic vars 19110 1726882564.60283: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19110 1726882564.60319: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19110 1726882564.60344: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19110 1726882564.60370: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19110 1726882564.60389: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19110 1726882564.60420: variable 'inventory_hostname' from source: host vars for 'managed_node1' 19110 1726882564.60431: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882564.60439: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882564.60546: Set connection var ansible_timeout to 10 19110 1726882564.60569: Set connection 
var ansible_module_compression to ZIP_DEFLATED 19110 1726882564.60579: Set connection var ansible_shell_executable to /bin/sh 19110 1726882564.60585: Set connection var ansible_shell_type to sh 19110 1726882564.60590: Set connection var ansible_connection to ssh 19110 1726882564.60601: Set connection var ansible_pipelining to False 19110 1726882564.60629: variable 'ansible_shell_executable' from source: unknown 19110 1726882564.60636: variable 'ansible_connection' from source: unknown 19110 1726882564.60642: variable 'ansible_module_compression' from source: unknown 19110 1726882564.60648: variable 'ansible_shell_type' from source: unknown 19110 1726882564.60654: variable 'ansible_shell_executable' from source: unknown 19110 1726882564.60667: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882564.60675: variable 'ansible_pipelining' from source: unknown 19110 1726882564.60681: variable 'ansible_timeout' from source: unknown 19110 1726882564.60688: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882564.60875: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 19110 1726882564.60892: variable 'omit' from source: magic vars 19110 1726882564.60906: starting attempt loop 19110 1726882564.60913: running the handler 19110 1726882564.60935: variable 'ansible_facts' from source: unknown 19110 1726882564.60958: _low_level_execute_command(): starting 19110 1726882564.60972: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 19110 1726882564.61685: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882564.61700: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 
1726882564.61717: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882564.61736: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882564.61780: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882564.61791: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882564.61803: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882564.61822: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882564.61835: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882564.61845: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882564.61858: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882564.61873: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882564.61888: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882564.61897: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882564.61906: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882564.61916: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882564.61997: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882564.62018: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882564.62033: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882564.62273: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 
1726882564.63837: stdout chunk (state=3): >>>/root <<< 19110 1726882564.63939: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882564.64018: stderr chunk (state=3): >>><<< 19110 1726882564.64022: stdout chunk (state=3): >>><<< 19110 1726882564.64127: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 1726882564.64131: _low_level_execute_command(): starting 19110 1726882564.64133: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882564.640404-20077-75948949085538 `" && echo ansible-tmp-1726882564.640404-20077-75948949085538="` echo /root/.ansible/tmp/ansible-tmp-1726882564.640404-20077-75948949085538 `" ) && sleep 0' 19110 1726882564.64713: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 
Jun 2024 <<< 19110 1726882564.64729: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882564.64744: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882564.64762: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882564.64809: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882564.64822: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882564.64833: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882564.64847: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882564.64859: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882564.64870: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882564.64881: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882564.64894: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882564.64910: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882564.64925: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882564.64936: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882564.64949: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882564.65030: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882564.65048: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882564.65068: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 
19110 1726882564.65270: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882564.67108: stdout chunk (state=3): >>>ansible-tmp-1726882564.640404-20077-75948949085538=/root/.ansible/tmp/ansible-tmp-1726882564.640404-20077-75948949085538 <<< 19110 1726882564.67224: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882564.67300: stderr chunk (state=3): >>><<< 19110 1726882564.67310: stdout chunk (state=3): >>><<< 19110 1726882564.67669: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882564.640404-20077-75948949085538=/root/.ansible/tmp/ansible-tmp-1726882564.640404-20077-75948949085538 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 1726882564.67673: variable 'ansible_module_compression' from source: unknown 19110 1726882564.67676: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-191108pnkimox/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 19110 1726882564.67678: variable 'ansible_facts' from source: unknown 19110 1726882564.67680: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882564.640404-20077-75948949085538/AnsiballZ_setup.py 19110 1726882564.67826: Sending initial data 19110 1726882564.67829: Sent initial data (152 bytes) 19110 1726882564.68831: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882564.68844: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882564.68860: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882564.68884: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882564.68924: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882564.68935: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882564.68947: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882564.68968: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882564.68980: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882564.68995: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882564.69006: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882564.69018: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882564.69032: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882564.69043: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 
1726882564.69053: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882564.69071: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882564.69152: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882564.69177: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882564.69191: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882564.69314: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882564.71044: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 19110 1726882564.71141: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 19110 1726882564.71233: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-191108pnkimox/tmpe9kr2dcu /root/.ansible/tmp/ansible-tmp-1726882564.640404-20077-75948949085538/AnsiballZ_setup.py <<< 19110 1726882564.71326: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 19110 1726882564.74287: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882564.74474: stderr chunk (state=3): >>><<< 19110 1726882564.74477: stdout chunk (state=3): >>><<< 19110 1726882564.74479: done transferring module 
to remote 19110 1726882564.74482: _low_level_execute_command(): starting 19110 1726882564.74488: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882564.640404-20077-75948949085538/ /root/.ansible/tmp/ansible-tmp-1726882564.640404-20077-75948949085538/AnsiballZ_setup.py && sleep 0' 19110 1726882564.75962: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882564.75968: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882564.76005: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882564.76008: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882564.76010: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882564.76069: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882564.76086: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882564.76183: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882564.77937: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882564.77996: stderr chunk 
(state=3): >>><<< 19110 1726882564.78000: stdout chunk (state=3): >>><<< 19110 1726882564.78072: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 1726882564.78078: _low_level_execute_command(): starting 19110 1726882564.78087: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882564.640404-20077-75948949085538/AnsiballZ_setup.py && sleep 0' 19110 1726882564.81447: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882564.81476: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882564.81512: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882564.81534: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 
1726882564.81629: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882564.81644: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882564.81665: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882564.81687: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882564.81725: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882564.81737: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882564.81750: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882564.81771: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882564.81792: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882564.81833: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882564.81853: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882564.81879: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882564.82031: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882564.82062: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882564.82094: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882564.82289: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882565.35260: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", 
"ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_system": "Linux", "ansible_kernel": "5.14.0-508.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 15:49:37 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "ip-10-31-44-90.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-44-90", "ansible_nodename": "ip-10-31-44-90.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "465293f2bd9b457497a5eaf565f184f8", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_lsb": {}, "ansible_apparmor": {"status": "disabled"}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_is_chroot": false, "ansible_pkg_mgr": "dnf", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "36", "second": "05", "epoch": "1726882565", "epoch_int": "1726882565", "date": "2024-09-20", "time": "21:36:05", "iso8601_micro": "2024-09-21T01:36:05.069936Z", "iso8601": "2024-09-21T01:36:05Z", 
"iso8601_basic": "20240920T213605069936", "iso8601_basic_short": "20240920T213605", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_env": {"SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.40.7 34614 10.31.44.90 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.40.7 34614 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_fibre_channel_wwn": [], "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBANd7RrvRqB/kTBmp4g3vOXSd1XQX7zSKmnuTmpsZ60vmB0PwXJIj7HTI9DBqGrzofXOOHlh/Qs4ml+D2H79EO+6EUEZza9meZ+anJVVwXBuxgfn1Hl/EExsmp7gV5o+r0rCFeuFds6ISUZQpal2OlQk3lrit00RA//JoxvQO1YQdAAAAFQDxLJGSeERY5sZYyGr88om1BNq9KQAAAIEA1WO5ElHMof6Upf9GQZn4wlJh6JUOjZfYnL4XATh/W6uye7kuC/rBqGPirkmks1GCUkKhSkzQlRIPyLYENrbPKRMNarYAwwQ8N8BDOUWFDCcrO55SJdlLTyYGWdlVFysYGSMIyZT5ye4oL3Nkff/e1ZGqjvL0sLNJaye4za70Xj4AAACAXRnpJPDKgCzakgHO14HsH3r7qS4cvfQld+ThBJZcbtKtJUyUo1sqg9NbvK+hvA41TYWOVy52DcWn5yzPwfhAn+mQEcAdBXWqSSJdESZ9fPnbc9C1XO02sYpyX5+wsZnau23XhtnlnY8jRTpWgRt4X8AWaSND9mfeL8ZDTgVphRc=", "ansible<<< 19110 1726882565.35274: stdout chunk (state=3): 
>>>_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCi3knQjBMOMykE1WC1CWkOsV5q0hmL0iOCSTq050rqbxjUmkXoe4BeOWZh4ID8k5GgD5emRh5PU5ME25rsP6hnelUBzDOCjdgI1rmTkUbw5oFRN+kpX2RbAM/2L5J6RrqqllFBjrHtSG6JINsPIWqLn9rlmE965Kj4AY3QplgiApeN07yDM5YPS2tpRpVHVIxZia7pdzKs+h+TXJoo/Z7SMAw8MTUNeIXd9bSzuxhulOrLNrvrcej4EVg88FYiy96oA+NBNLFb41RHNLumM3xUQvjCeyqP1hcUw9YAY+gwADhRGBKQ+JCAzzLqyM/3RAO8gXXJr1Yjr+H9xi8IwKB71QU/lw7bWc33YuNbe5mDlUHQ/a2qvo4O96wD8m4eZpu81iHiwtIU5cwKm+fk8sz9kxOR77AozaYLtjgW9FYUmxh2ChfTBV2rnFflhC3CjgRMlZv8CLtne5JcsRFSpHeCB2RXzA1JPiF89OxoUFa8NsPqdUyAMUkaR8MmW+fj+t8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAqf0PBdBRTqmYqGYFABEC2soCa/Gsi3A2munUEZEo0enWjwRQivB5pKJDjOn6lwgGyr2ebU0/VpRzddk73uEfk=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOdij61fBkKDElS700Z560nYW2c4QCIx/VplUW7jn+UE", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_loadavg": {"1m": 0.47, "5m": 0.4, "15m": 0.22}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_local": {}, "ansible_iscsi_iqn": "", "ansible_fips": false, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, 
"ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2807, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 725, "free": 2807}, "nocache": {"free": 3269, "used": 263}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2ca037-5a7b-1ebc-28c4-7fb32adce23a", "ansible_product_uuid": "ec2ca037-5a7b-1ebc-28c4-7fb32adce23a", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 723, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": 
"rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264239267840, "block_size": 4096, "block_total": 65519355, "block_available": 64511540, "block_used": 1007815, "inode_total": 131071472, "inode_available": 130998698, "inode_used": 72774, "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013"}], "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:ef4e1c39-6f50-438a-87e7-12fb70b80bde", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_service_mgr": "systemd", "ansible_interfaces": ["peerlsr27", "eth0", "lo", "lsr27"], "ansible_eth0": {"device": "eth0", "macaddress": "02:9e:a1:0b:f9:6d", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.44.90", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::9e:a1ff:fe0b:f96d", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", 
"tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lsr27": {"device": "lsr27", "macaddress": "a6:f2:4e:57:42:4a", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv4": {"address": "192.0.2.1", "broadcast": "192.0.2.255", "netmask": "255.255.255.0", "network": "192.0.2.0", "prefix": "24"}, "ipv6": [{"address": "fe80::8471:8085:75a2:367", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", 
"scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "ge<<< 19110 1726882565.35280: stdout chunk (state=3): >>>neric_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_peerlsr27": 
{"device": "peerlsr27", "macaddress": "3a:77:3d:04:80:fa", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::3877:3dff:fe04:80fa", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", 
"tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", 
"tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.44.90", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:9e:a1:0b:f9:6d", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.44.90", "192.0.2.1"], "ansible_all_ipv6_addresses": ["fe80::9e:a1ff:fe0b:f96d", "fe80::8471:8085:75a2:367", "fe80::3877:3dff:fe04:80fa"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.44.90", "127.0.0.0/8", "127.0.0.1", "192.0.2.1"], "ipv6": ["::1", "fe80::9e:a1ff:fe0b:f96d", "fe80::3877:3dff:fe04:80fa", "fe80::8471:8085:75a2:367"]}, "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 19110 1726882565.37039: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
<<< 19110 1726882565.37043: stdout chunk (state=3): >>><<< 19110 1726882565.37045: stderr chunk (state=3): >>><<< 19110 1726882565.37375: _low_level_execute_command() done: rc=0, stdout= , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
19110 1726882565.37599: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882564.640404-20077-75948949085538/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 19110 1726882565.37623: _low_level_execute_command(): starting 19110 1726882565.37632: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882564.640404-20077-75948949085538/ > /dev/null 2>&1 && sleep 0' 19110 1726882565.39657: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882565.39708: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882565.39722: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882565.39739: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882565.39842: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882565.39854: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882565.39870: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882565.39888: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882565.39901: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 
is address <<< 19110 1726882565.39918: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882565.39931: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882565.39946: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882565.39963: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882565.39978: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882565.39992: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882565.40039: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882565.40166: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882565.40219: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882565.40241: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882565.40372: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882565.42262: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882565.42268: stdout chunk (state=3): >>><<< 19110 1726882565.42271: stderr chunk (state=3): >>><<< 19110 1726882565.42573: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 1726882565.42576: handler run complete 19110 1726882565.42579: variable 'ansible_facts' from source: unknown 19110 1726882565.42581: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882565.43019: variable 'ansible_facts' from source: unknown 19110 1726882565.43242: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882565.43523: attempt loop complete, returning result 19110 1726882565.43671: _execute() done 19110 1726882565.43678: dumping result to json 19110 1726882565.43721: done dumping result, returning 19110 1726882565.43733: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [0e448fcc-3ce9-5372-c19a-00000000032b] 19110 1726882565.43742: sending task result for task 0e448fcc-3ce9-5372-c19a-00000000032b ok: [managed_node1] 19110 1726882565.45678: no more pending results, returning what we have 19110 1726882565.45681: results queue empty 19110 1726882565.45682: checking for any_errors_fatal 19110 1726882565.45684: done checking for any_errors_fatal 19110 1726882565.45684: checking for max_fail_percentage 19110 1726882565.45686: done checking for max_fail_percentage 19110 1726882565.45687: checking to see if all hosts have failed and the running result is not ok 19110 1726882565.45688: done checking to 
see if all hosts have failed 19110 1726882565.45689: getting the remaining hosts for this loop 19110 1726882565.45690: done getting the remaining hosts for this loop 19110 1726882565.45694: getting the next task for host managed_node1 19110 1726882565.45700: done getting next task for host managed_node1 19110 1726882565.45702: ^ task is: TASK: meta (flush_handlers) 19110 1726882565.45704: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19110 1726882565.45708: getting variables 19110 1726882565.45710: in VariableManager get_vars() 19110 1726882565.45765: Calling all_inventory to load vars for managed_node1 19110 1726882565.45768: Calling groups_inventory to load vars for managed_node1 19110 1726882565.45771: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882565.45783: Calling all_plugins_play to load vars for managed_node1 19110 1726882565.45786: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882565.45789: Calling groups_plugins_play to load vars for managed_node1 19110 1726882565.46833: done sending task result for task 0e448fcc-3ce9-5372-c19a-00000000032b 19110 1726882565.46837: WORKER PROCESS EXITING 19110 1726882565.57522: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882565.61020: done with get_vars() 19110 1726882565.61047: done getting variables 19110 1726882565.61109: in VariableManager get_vars() 19110 1726882565.61122: Calling all_inventory to load vars for managed_node1 19110 1726882565.61124: Calling groups_inventory to load vars for managed_node1 19110 1726882565.61127: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882565.61132: Calling all_plugins_play to load 
vars for managed_node1 19110 1726882565.61134: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882565.61140: Calling groups_plugins_play to load vars for managed_node1 19110 1726882565.62400: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882565.64248: done with get_vars() 19110 1726882565.64275: done queuing things up, now waiting for results queue to drain 19110 1726882565.64277: results queue empty 19110 1726882565.64278: checking for any_errors_fatal 19110 1726882565.64282: done checking for any_errors_fatal 19110 1726882565.64283: checking for max_fail_percentage 19110 1726882565.64284: done checking for max_fail_percentage 19110 1726882565.64285: checking to see if all hosts have failed and the running result is not ok 19110 1726882565.64285: done checking to see if all hosts have failed 19110 1726882565.64286: getting the remaining hosts for this loop 19110 1726882565.64287: done getting the remaining hosts for this loop 19110 1726882565.64290: getting the next task for host managed_node1 19110 1726882565.64294: done getting next task for host managed_node1 19110 1726882565.64297: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 19110 1726882565.64298: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19110 1726882565.64307: getting variables 19110 1726882565.64308: in VariableManager get_vars() 19110 1726882565.64321: Calling all_inventory to load vars for managed_node1 19110 1726882565.64324: Calling groups_inventory to load vars for managed_node1 19110 1726882565.64331: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882565.64336: Calling all_plugins_play to load vars for managed_node1 19110 1726882565.64339: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882565.64342: Calling groups_plugins_play to load vars for managed_node1 19110 1726882565.65580: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882565.67318: done with get_vars() 19110 1726882565.67336: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 21:36:05 -0400 (0:00:01.085) 0:00:22.531 ****** 19110 1726882565.67415: entering _queue_task() for managed_node1/include_tasks 19110 1726882565.67756: worker is 1 (out of 1 available) 19110 1726882565.67772: exiting _queue_task() for managed_node1/include_tasks 19110 1726882565.67785: done queuing things up, now waiting for results queue to drain 19110 1726882565.67787: waiting for pending results... 
19110 1726882565.68079: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 19110 1726882565.68232: in run() - task 0e448fcc-3ce9-5372-c19a-00000000003c 19110 1726882565.68260: variable 'ansible_search_path' from source: unknown 19110 1726882565.68283: variable 'ansible_search_path' from source: unknown 19110 1726882565.69016: calling self._execute() 19110 1726882565.69112: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882565.69241: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882565.69255: variable 'omit' from source: magic vars 19110 1726882565.69982: variable 'ansible_distribution_major_version' from source: facts 19110 1726882565.70120: Evaluated conditional (ansible_distribution_major_version != '6'): True 19110 1726882565.70132: _execute() done 19110 1726882565.70140: dumping result to json 19110 1726882565.70147: done dumping result, returning 19110 1726882565.70158: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0e448fcc-3ce9-5372-c19a-00000000003c] 19110 1726882565.70172: sending task result for task 0e448fcc-3ce9-5372-c19a-00000000003c 19110 1726882565.70315: no more pending results, returning what we have 19110 1726882565.70321: in VariableManager get_vars() 19110 1726882565.70367: Calling all_inventory to load vars for managed_node1 19110 1726882565.70370: Calling groups_inventory to load vars for managed_node1 19110 1726882565.70373: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882565.70385: Calling all_plugins_play to load vars for managed_node1 19110 1726882565.70389: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882565.70393: Calling groups_plugins_play to load vars for managed_node1 19110 1726882565.71503: done sending task result for task 0e448fcc-3ce9-5372-c19a-00000000003c 19110 
1726882565.71507: WORKER PROCESS EXITING 19110 1726882565.72225: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882565.73951: done with get_vars() 19110 1726882565.73973: variable 'ansible_search_path' from source: unknown 19110 1726882565.73975: variable 'ansible_search_path' from source: unknown 19110 1726882565.74003: we have included files to process 19110 1726882565.74004: generating all_blocks data 19110 1726882565.74005: done generating all_blocks data 19110 1726882565.74006: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 19110 1726882565.74007: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 19110 1726882565.74009: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 19110 1726882565.74612: done processing included file 19110 1726882565.74614: iterating over new_blocks loaded from include file 19110 1726882565.74615: in VariableManager get_vars() 19110 1726882565.74636: done with get_vars() 19110 1726882565.74637: filtering new block on tags 19110 1726882565.74653: done filtering new block on tags 19110 1726882565.74655: in VariableManager get_vars() 19110 1726882565.74678: done with get_vars() 19110 1726882565.74680: filtering new block on tags 19110 1726882565.74705: done filtering new block on tags 19110 1726882565.74707: in VariableManager get_vars() 19110 1726882565.74727: done with get_vars() 19110 1726882565.74729: filtering new block on tags 19110 1726882565.74744: done filtering new block on tags 19110 1726882565.74746: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node1 19110 1726882565.74751: extending task lists for all hosts 
with included blocks 19110 1726882565.75148: done extending task lists 19110 1726882565.75149: done processing included files 19110 1726882565.75150: results queue empty 19110 1726882565.75151: checking for any_errors_fatal 19110 1726882565.75152: done checking for any_errors_fatal 19110 1726882565.75153: checking for max_fail_percentage 19110 1726882565.75154: done checking for max_fail_percentage 19110 1726882565.75154: checking to see if all hosts have failed and the running result is not ok 19110 1726882565.75155: done checking to see if all hosts have failed 19110 1726882565.75156: getting the remaining hosts for this loop 19110 1726882565.75157: done getting the remaining hosts for this loop 19110 1726882565.75160: getting the next task for host managed_node1 19110 1726882565.75165: done getting next task for host managed_node1 19110 1726882565.75168: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 19110 1726882565.75170: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19110 1726882565.75178: getting variables 19110 1726882565.75179: in VariableManager get_vars() 19110 1726882565.75193: Calling all_inventory to load vars for managed_node1 19110 1726882565.75195: Calling groups_inventory to load vars for managed_node1 19110 1726882565.75197: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882565.75202: Calling all_plugins_play to load vars for managed_node1 19110 1726882565.75205: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882565.75208: Calling groups_plugins_play to load vars for managed_node1 19110 1726882565.77354: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882565.79731: done with get_vars() 19110 1726882565.79753: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 21:36:05 -0400 (0:00:00.124) 0:00:22.655 ****** 19110 1726882565.79833: entering _queue_task() for managed_node1/setup 19110 1726882565.80185: worker is 1 (out of 1 available) 19110 1726882565.80197: exiting _queue_task() for managed_node1/setup 19110 1726882565.80208: done queuing things up, now waiting for results queue to drain 19110 1726882565.80209: waiting for pending results... 
19110 1726882565.80495: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 19110 1726882565.80630: in run() - task 0e448fcc-3ce9-5372-c19a-00000000036c 19110 1726882565.80655: variable 'ansible_search_path' from source: unknown 19110 1726882565.80667: variable 'ansible_search_path' from source: unknown 19110 1726882565.80709: calling self._execute() 19110 1726882565.80804: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882565.80839: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882565.80854: variable 'omit' from source: magic vars 19110 1726882565.81351: variable 'ansible_distribution_major_version' from source: facts 19110 1726882565.81373: Evaluated conditional (ansible_distribution_major_version != '6'): True 19110 1726882565.81902: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 19110 1726882565.85395: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 19110 1726882565.86278: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 19110 1726882565.86411: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 19110 1726882565.86455: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 19110 1726882565.86488: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 19110 1726882565.86628: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19110 1726882565.86811: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19110 1726882565.86842: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19110 1726882565.87062: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19110 1726882565.87090: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19110 1726882565.87138: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19110 1726882565.88100: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19110 1726882565.88185: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19110 1726882565.88276: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19110 1726882565.88389: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19110 1726882565.88848: variable '__network_required_facts' from source: role 
'' defaults 19110 1726882565.89008: variable 'ansible_facts' from source: unknown 19110 1726882565.90459: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 19110 1726882565.90529: when evaluation is False, skipping this task 19110 1726882565.90539: _execute() done 19110 1726882565.90546: dumping result to json 19110 1726882565.90554: done dumping result, returning 19110 1726882565.90568: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0e448fcc-3ce9-5372-c19a-00000000036c] 19110 1726882565.90579: sending task result for task 0e448fcc-3ce9-5372-c19a-00000000036c skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 19110 1726882565.90722: no more pending results, returning what we have 19110 1726882565.90728: results queue empty 19110 1726882565.90729: checking for any_errors_fatal 19110 1726882565.90730: done checking for any_errors_fatal 19110 1726882565.90731: checking for max_fail_percentage 19110 1726882565.90733: done checking for max_fail_percentage 19110 1726882565.90734: checking to see if all hosts have failed and the running result is not ok 19110 1726882565.90735: done checking to see if all hosts have failed 19110 1726882565.90735: getting the remaining hosts for this loop 19110 1726882565.90737: done getting the remaining hosts for this loop 19110 1726882565.90741: getting the next task for host managed_node1 19110 1726882565.90750: done getting next task for host managed_node1 19110 1726882565.90755: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 19110 1726882565.90757: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19110 1726882565.90774: getting variables 19110 1726882565.90776: in VariableManager get_vars() 19110 1726882565.90816: Calling all_inventory to load vars for managed_node1 19110 1726882565.90820: Calling groups_inventory to load vars for managed_node1 19110 1726882565.90823: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882565.90833: Calling all_plugins_play to load vars for managed_node1 19110 1726882565.90837: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882565.90840: Calling groups_plugins_play to load vars for managed_node1 19110 1726882565.92673: done sending task result for task 0e448fcc-3ce9-5372-c19a-00000000036c 19110 1726882565.92677: WORKER PROCESS EXITING 19110 1726882565.95187: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882565.99457: done with get_vars() 19110 1726882565.99492: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 21:36:05 -0400 (0:00:00.197) 0:00:22.853 ****** 19110 1726882565.99589: entering _queue_task() for managed_node1/stat 19110 1726882566.00916: worker is 1 (out of 1 available) 19110 1726882566.00930: exiting _queue_task() for managed_node1/stat 19110 1726882566.00943: done queuing things up, now waiting for results queue to drain 19110 1726882566.00944: waiting for pending results... 
19110 1726882566.02182: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if system is ostree 19110 1726882566.02673: in run() - task 0e448fcc-3ce9-5372-c19a-00000000036e 19110 1726882566.02688: variable 'ansible_search_path' from source: unknown 19110 1726882566.02691: variable 'ansible_search_path' from source: unknown 19110 1726882566.02732: calling self._execute() 19110 1726882566.03272: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882566.03283: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882566.03293: variable 'omit' from source: magic vars 19110 1726882566.04336: variable 'ansible_distribution_major_version' from source: facts 19110 1726882566.04348: Evaluated conditional (ansible_distribution_major_version != '6'): True 19110 1726882566.04986: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 19110 1726882566.05749: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 19110 1726882566.06004: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 19110 1726882566.06037: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 19110 1726882566.06277: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 19110 1726882566.06575: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 19110 1726882566.06599: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 19110 1726882566.06628: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 19110 1726882566.06882: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 19110 1726882566.06973: variable '__network_is_ostree' from source: set_fact 19110 1726882566.07204: Evaluated conditional (not __network_is_ostree is defined): False 19110 1726882566.07208: when evaluation is False, skipping this task 19110 1726882566.07211: _execute() done 19110 1726882566.07213: dumping result to json 19110 1726882566.07215: done dumping result, returning 19110 1726882566.07222: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if system is ostree [0e448fcc-3ce9-5372-c19a-00000000036e] 19110 1726882566.07229: sending task result for task 0e448fcc-3ce9-5372-c19a-00000000036e 19110 1726882566.07323: done sending task result for task 0e448fcc-3ce9-5372-c19a-00000000036e 19110 1726882566.07326: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 19110 1726882566.07378: no more pending results, returning what we have 19110 1726882566.07382: results queue empty 19110 1726882566.07383: checking for any_errors_fatal 19110 1726882566.07389: done checking for any_errors_fatal 19110 1726882566.07390: checking for max_fail_percentage 19110 1726882566.07392: done checking for max_fail_percentage 19110 1726882566.07393: checking to see if all hosts have failed and the running result is not ok 19110 1726882566.07394: done checking to see if all hosts have failed 19110 1726882566.07395: getting the remaining hosts for this loop 19110 1726882566.07397: done getting the remaining hosts for this loop 19110 
1726882566.07401: getting the next task for host managed_node1 19110 1726882566.07408: done getting next task for host managed_node1 19110 1726882566.07412: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 19110 1726882566.07415: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19110 1726882566.07428: getting variables 19110 1726882566.07430: in VariableManager get_vars() 19110 1726882566.07473: Calling all_inventory to load vars for managed_node1 19110 1726882566.07476: Calling groups_inventory to load vars for managed_node1 19110 1726882566.07478: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882566.07490: Calling all_plugins_play to load vars for managed_node1 19110 1726882566.07494: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882566.07497: Calling groups_plugins_play to load vars for managed_node1 19110 1726882566.11115: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882566.16896: done with get_vars() 19110 1726882566.17147: done getting variables 19110 1726882566.17210: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK 
[fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 21:36:06 -0400 (0:00:00.176) 0:00:23.029 ****** 19110 1726882566.17245: entering _queue_task() for managed_node1/set_fact 19110 1726882566.18280: worker is 1 (out of 1 available) 19110 1726882566.18291: exiting _queue_task() for managed_node1/set_fact 19110 1726882566.18302: done queuing things up, now waiting for results queue to drain 19110 1726882566.18303: waiting for pending results... 19110 1726882566.19796: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 19110 1726882566.19919: in run() - task 0e448fcc-3ce9-5372-c19a-00000000036f 19110 1726882566.20086: variable 'ansible_search_path' from source: unknown 19110 1726882566.20096: variable 'ansible_search_path' from source: unknown 19110 1726882566.20151: calling self._execute() 19110 1726882566.20554: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882566.20679: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882566.20694: variable 'omit' from source: magic vars 19110 1726882566.22552: variable 'ansible_distribution_major_version' from source: facts 19110 1726882566.22572: Evaluated conditional (ansible_distribution_major_version != '6'): True 19110 1726882566.22830: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 19110 1726882566.24733: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 19110 1726882566.24785: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 19110 1726882566.24825: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 19110 
1726882566.25206: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 19110 1726882566.25322: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 19110 1726882566.25697: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 19110 1726882566.25727: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 19110 1726882566.26096: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 19110 1726882566.26195: variable '__network_is_ostree' from source: set_fact 19110 1726882566.26380: Evaluated conditional (not __network_is_ostree is defined): False 19110 1726882566.26389: when evaluation is False, skipping this task 19110 1726882566.26401: _execute() done 19110 1726882566.26407: dumping result to json 19110 1726882566.26432: done dumping result, returning 19110 1726882566.26445: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0e448fcc-3ce9-5372-c19a-00000000036f] 19110 1726882566.26454: sending task result for task 0e448fcc-3ce9-5372-c19a-00000000036f skipping: [managed_node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 19110 1726882566.26617: no more pending results, returning what we have 19110 1726882566.26621: results queue empty 19110 1726882566.26622: checking for any_errors_fatal 19110 1726882566.26631: done checking 
for any_errors_fatal 19110 1726882566.26632: checking for max_fail_percentage 19110 1726882566.26634: done checking for max_fail_percentage 19110 1726882566.26635: checking to see if all hosts have failed and the running result is not ok 19110 1726882566.26636: done checking to see if all hosts have failed 19110 1726882566.26636: getting the remaining hosts for this loop 19110 1726882566.26638: done getting the remaining hosts for this loop 19110 1726882566.26641: getting the next task for host managed_node1 19110 1726882566.26650: done getting next task for host managed_node1 19110 1726882566.26654: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 19110 1726882566.26659: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19110 1726882566.26678: done sending task result for task 0e448fcc-3ce9-5372-c19a-00000000036f 19110 1726882566.26682: WORKER PROCESS EXITING 19110 1726882566.26689: getting variables 19110 1726882566.26691: in VariableManager get_vars() 19110 1726882566.26735: Calling all_inventory to load vars for managed_node1 19110 1726882566.26738: Calling groups_inventory to load vars for managed_node1 19110 1726882566.26740: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882566.26750: Calling all_plugins_play to load vars for managed_node1 19110 1726882566.26753: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882566.26757: Calling groups_plugins_play to load vars for managed_node1 19110 1726882566.28568: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882566.31471: done with get_vars() 19110 1726882566.31502: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 21:36:06 -0400 (0:00:00.143) 0:00:23.173 ****** 19110 1726882566.31609: entering _queue_task() for managed_node1/service_facts 19110 1726882566.31948: worker is 1 (out of 1 available) 19110 1726882566.31962: exiting _queue_task() for managed_node1/service_facts 19110 1726882566.31976: done queuing things up, now waiting for results queue to drain 19110 1726882566.31977: waiting for pending results... 
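The two skipped tasks above both short-circuit on the same conditional: `__network_is_ostree` was already set by an earlier `set_fact`, so `not __network_is_ostree is defined` renders False and the executor returns a skipped result without ever contacting the host. A minimal sketch of that control flow (this is an illustration of the behavior visible in the log, not Ansible's actual TaskExecutor code, and it only handles the one `not <var> is defined` pattern seen here):

```python
def evaluate_when(task_vars: dict, condition: str) -> bool:
    # The real engine renders the condition through Jinja2; this sketch
    # only recognizes the single pattern from the log above.
    if condition.startswith("not ") and condition.endswith(" is defined"):
        var = condition[len("not "):-len(" is defined")]
        return var not in task_vars
    raise NotImplementedError(condition)

def run_task(task_vars: dict, condition: str) -> dict:
    if not evaluate_when(task_vars, condition):
        # Mirrors the shape of the "skipping: [managed_node1] => {...}"
        # result dumped in the log.
        return {
            "changed": False,
            "false_condition": condition,
            "skip_reason": "Conditional result was False",
        }
    return {"changed": True}

result = run_task({"__network_is_ostree": False},
                  "not __network_is_ostree is defined")
```

Because the variable is defined (its value, even False, is irrelevant to `is defined`), the result is the skipped dict — which is why both ostree tasks skip here while the `service_facts` task that follows, having no such guard, proceeds to a real connection.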
19110 1726882566.32279: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which services are running 19110 1726882566.32769: in run() - task 0e448fcc-3ce9-5372-c19a-000000000371 19110 1726882566.32773: variable 'ansible_search_path' from source: unknown 19110 1726882566.32775: variable 'ansible_search_path' from source: unknown 19110 1726882566.32778: calling self._execute() 19110 1726882566.32780: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882566.32782: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882566.32785: variable 'omit' from source: magic vars 19110 1726882566.33809: variable 'ansible_distribution_major_version' from source: facts 19110 1726882566.33940: Evaluated conditional (ansible_distribution_major_version != '6'): True 19110 1726882566.33943: variable 'omit' from source: magic vars 19110 1726882566.34071: variable 'omit' from source: magic vars 19110 1726882566.34101: variable 'omit' from source: magic vars 19110 1726882566.34256: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19110 1726882566.34295: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19110 1726882566.34313: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19110 1726882566.34336: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19110 1726882566.34345: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19110 1726882566.34513: variable 'inventory_hostname' from source: host vars for 'managed_node1' 19110 1726882566.34519: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882566.34525: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node1' 19110 1726882566.34655: Set connection var ansible_timeout to 10 19110 1726882566.34674: Set connection var ansible_module_compression to ZIP_DEFLATED 19110 1726882566.34678: Set connection var ansible_shell_executable to /bin/sh 19110 1726882566.34680: Set connection var ansible_shell_type to sh 19110 1726882566.34683: Set connection var ansible_connection to ssh 19110 1726882566.34689: Set connection var ansible_pipelining to False 19110 1726882566.34831: variable 'ansible_shell_executable' from source: unknown 19110 1726882566.34834: variable 'ansible_connection' from source: unknown 19110 1726882566.34837: variable 'ansible_module_compression' from source: unknown 19110 1726882566.34840: variable 'ansible_shell_type' from source: unknown 19110 1726882566.34842: variable 'ansible_shell_executable' from source: unknown 19110 1726882566.34844: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882566.34850: variable 'ansible_pipelining' from source: unknown 19110 1726882566.34852: variable 'ansible_timeout' from source: unknown 19110 1726882566.34856: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882566.35282: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 19110 1726882566.35292: variable 'omit' from source: magic vars 19110 1726882566.35297: starting attempt loop 19110 1726882566.35300: running the handler 19110 1726882566.35313: _low_level_execute_command(): starting 19110 1726882566.35322: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 19110 1726882566.38699: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882566.38716: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
<<< 19110 1726882566.38725: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882566.38738: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882566.38803: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882566.38822: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882566.38832: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882566.38941: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882566.38950: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882566.38956: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882566.38971: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882566.38980: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882566.38992: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882566.38999: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882566.39006: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882566.39016: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882566.39100: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882566.39157: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882566.39167: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882566.39378: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 
1726882566.41056: stdout chunk (state=3): >>>/root <<< 19110 1726882566.41246: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882566.41255: stdout chunk (state=3): >>><<< 19110 1726882566.41267: stderr chunk (state=3): >>><<< 19110 1726882566.41288: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 1726882566.41301: _low_level_execute_command(): starting 19110 1726882566.41307: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882566.4128747-20148-162068969804867 `" && echo ansible-tmp-1726882566.4128747-20148-162068969804867="` echo /root/.ansible/tmp/ansible-tmp-1726882566.4128747-20148-162068969804867 `" ) && sleep 0' 19110 1726882566.42857: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 
3.2.2 4 Jun 2024 <<< 19110 1726882566.43019: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882566.43036: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882566.43050: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882566.43094: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882566.43101: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882566.43111: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882566.43130: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882566.43143: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882566.43150: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882566.43160: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882566.43172: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882566.43184: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882566.43191: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882566.43198: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882566.43207: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882566.43314: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882566.43370: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882566.43381: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 
<<< 19110 1726882566.43565: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882566.45427: stdout chunk (state=3): >>>ansible-tmp-1726882566.4128747-20148-162068969804867=/root/.ansible/tmp/ansible-tmp-1726882566.4128747-20148-162068969804867 <<< 19110 1726882566.45580: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882566.45617: stderr chunk (state=3): >>><<< 19110 1726882566.45620: stdout chunk (state=3): >>><<< 19110 1726882566.45643: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882566.4128747-20148-162068969804867=/root/.ansible/tmp/ansible-tmp-1726882566.4128747-20148-162068969804867 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 1726882566.45696: variable 'ansible_module_compression' from source: unknown 19110 1726882566.45740: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-191108pnkimox/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 19110 1726882566.45785: variable 'ansible_facts' from source: unknown 19110 1726882566.45848: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882566.4128747-20148-162068969804867/AnsiballZ_service_facts.py 19110 1726882566.46472: Sending initial data 19110 1726882566.46476: Sent initial data (162 bytes) 19110 1726882566.50747: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882566.50753: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882566.51206: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882566.51210: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882566.51225: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882566.51231: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882566.51305: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882566.51779: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882566.51784: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882566.51906: stderr chunk 
(state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882566.53641: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 19110 1726882566.53733: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 19110 1726882566.53830: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-191108pnkimox/tmpepg5hyl1 /root/.ansible/tmp/ansible-tmp-1726882566.4128747-20148-162068969804867/AnsiballZ_service_facts.py <<< 19110 1726882566.53923: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 19110 1726882566.55357: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882566.55475: stderr chunk (state=3): >>><<< 19110 1726882566.55478: stdout chunk (state=3): >>><<< 19110 1726882566.55494: done transferring module to remote 19110 1726882566.55506: _low_level_execute_command(): starting 19110 1726882566.55511: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882566.4128747-20148-162068969804867/ /root/.ansible/tmp/ansible-tmp-1726882566.4128747-20148-162068969804867/AnsiballZ_service_facts.py && sleep 0' 19110 1726882566.56256: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882566.56263: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882566.56308: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882566.56319: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882566.56330: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882566.56345: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882566.56354: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882566.56367: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882566.56380: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882566.56392: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882566.56405: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882566.56414: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882566.56422: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882566.56431: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882566.56507: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882566.56525: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882566.56537: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882566.56653: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882566.58471: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 
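The low-level sequence traced above follows Ansible's usual remote-execution pattern: create a per-task temp directory, sftp the AnsiballZ module payload into it, `chmod u+x` the directory and module, then invoke the remote interpreter. A sketch that reconstructs those three shell command strings as they appear in the log (the `ansible-tmp-<timestamp>-<pid>-<random>` naming pattern is inferred from the log output, not taken from Ansible's source):

```python
def remote_tmp_commands(ts: str, pid: int, rnd: int,
                        base: str = "/root/.ansible/tmp") -> list[str]:
    # Rebuilds the three commands the controller runs on the target,
    # in the order the log shows them. Illustrative only.
    tmp = f"{base}/ansible-tmp-{ts}-{pid}-{rnd}"
    module = f"{tmp}/AnsiballZ_service_facts.py"
    return [
        # 1. Create the temp dir with a restrictive umask (mode 700).
        f'( umask 77 && mkdir -p "{base}" && mkdir "{tmp}" ) && sleep 0',
        # 2. After the sftp put, mark dir and module executable.
        f"chmod u+x {tmp}/ {module} && sleep 0",
        # 3. Run the module with the target's discovered interpreter.
        f"/usr/bin/python3.9 {module} && sleep 0",
    ]

cmds = remote_tmp_commands("1726882566.4128747", 20148, 162068969804867)
```

Each command is wrapped in `/bin/sh -c '... && sleep 0'` by `_low_level_execute_command()`, and every invocation rides the existing ControlMaster multiplexed SSH session (`debug1: auto-mux: Trying existing master`), which is why the repeated connections in the trace are cheap.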
19110 1726882566.58474: stdout chunk (state=3): >>><<< 19110 1726882566.58476: stderr chunk (state=3): >>><<< 19110 1726882566.58554: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 1726882566.58558: _low_level_execute_command(): starting 19110 1726882566.58560: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882566.4128747-20148-162068969804867/AnsiballZ_service_facts.py && sleep 0' 19110 1726882566.60731: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882566.60735: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882566.60775: stderr chunk (state=3): >>>debug2: checking match 
for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882566.60778: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882566.60780: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found <<< 19110 1726882566.60783: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882566.60869: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882566.61016: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882566.61123: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882567.91970: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": 
"cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", 
"state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.serv<<< 19110 1726882567.92033: stdout chunk (state=3): >>>ice", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": 
"stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rhsmcertd.service": 
{"name": "rhsmcertd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": 
"systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": 
{"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "s<<< 19110 1726882567.92041: stdout chunk (state=3): >>>tatic", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", 
"status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": 
"user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": 
"dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhcd.service": {"name": "rhcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm-facts.service": {"name": "rhsm-facts.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm.service": {"name": "rhsm.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": 
"serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": 
"systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", 
"source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 19110 1726882567.93444: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
<<< 19110 1726882567.93448: stdout chunk (state=3): >>><<< 19110 1726882567.93457: stderr chunk (state=3): >>><<< 19110 1726882567.93484: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": 
"systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": 
"nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rhsmcertd.service": {"name": "rhsmcertd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": 
"serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": 
"systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": 
"rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhcd.service": {"name": "rhcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm-facts.service": {"name": "rhsm-facts.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm.service": {"name": "rhsm.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", 
"state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
19110 1726882567.94417: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882566.4128747-20148-162068969804867/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 19110 1726882567.94427: _low_level_execute_command(): starting 19110 1726882567.94430: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882566.4128747-20148-162068969804867/ > /dev/null 2>&1 && sleep 0' 19110 1726882567.95753: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882567.95767: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882567.95779: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882567.95793: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882567.95830: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882567.95837: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882567.95847: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882567.95869: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882567.95879: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address 
<<< 19110 1726882567.95887: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882567.95894: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882567.95904: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882567.95917: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882567.95922: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882567.95928: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882567.95937: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882567.96007: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882567.96024: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882567.96036: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882567.96184: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882567.97980: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882567.98031: stderr chunk (state=3): >>><<< 19110 1726882567.98034: stdout chunk (state=3): >>><<< 19110 1726882567.98277: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 1726882567.98280: handler run complete 19110 1726882567.98283: variable 'ansible_facts' from source: unknown 19110 1726882567.98421: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882567.98898: variable 'ansible_facts' from source: unknown 19110 1726882567.99045: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882567.99269: attempt loop complete, returning result 19110 1726882567.99279: _execute() done 19110 1726882567.99285: dumping result to json 19110 1726882567.99348: done dumping result, returning 19110 1726882567.99374: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which services are running [0e448fcc-3ce9-5372-c19a-000000000371] 19110 1726882567.99383: sending task result for task 0e448fcc-3ce9-5372-c19a-000000000371 ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 19110 1726882568.00439: no more pending results, returning what we have 19110 1726882568.00442: results queue empty 19110 1726882568.00443: checking for any_errors_fatal 19110 1726882568.00447: done checking for any_errors_fatal 19110 1726882568.00448: checking for max_fail_percentage 19110 1726882568.00449: 
done checking for max_fail_percentage 19110 1726882568.00450: checking to see if all hosts have failed and the running result is not ok 19110 1726882568.00451: done checking to see if all hosts have failed 19110 1726882568.00452: getting the remaining hosts for this loop 19110 1726882568.00453: done getting the remaining hosts for this loop 19110 1726882568.00457: getting the next task for host managed_node1 19110 1726882568.00463: done getting next task for host managed_node1 19110 1726882568.00468: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 19110 1726882568.00470: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19110 1726882568.00481: getting variables 19110 1726882568.00483: in VariableManager get_vars() 19110 1726882568.00515: Calling all_inventory to load vars for managed_node1 19110 1726882568.00518: Calling groups_inventory to load vars for managed_node1 19110 1726882568.00520: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882568.00530: Calling all_plugins_play to load vars for managed_node1 19110 1726882568.00533: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882568.00536: Calling groups_plugins_play to load vars for managed_node1 19110 1726882568.01629: done sending task result for task 0e448fcc-3ce9-5372-c19a-000000000371 19110 1726882568.01632: WORKER PROCESS EXITING 19110 1726882568.02301: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882568.03885: done with get_vars() 19110 1726882568.03903: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 21:36:08 -0400 (0:00:01.723) 0:00:24.897 ****** 19110 1726882568.04005: entering _queue_task() for managed_node1/package_facts 19110 1726882568.04311: worker is 1 (out of 1 available) 19110 1726882568.04324: exiting _queue_task() for managed_node1/package_facts 19110 1726882568.04336: done queuing things up, now waiting for results queue to drain 19110 1726882568.04338: waiting for pending results... 
19110 1726882568.04641: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which packages are installed 19110 1726882568.04785: in run() - task 0e448fcc-3ce9-5372-c19a-000000000372 19110 1726882568.04805: variable 'ansible_search_path' from source: unknown 19110 1726882568.04813: variable 'ansible_search_path' from source: unknown 19110 1726882568.04865: calling self._execute() 19110 1726882568.04980: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882568.04992: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882568.05010: variable 'omit' from source: magic vars 19110 1726882568.05448: variable 'ansible_distribution_major_version' from source: facts 19110 1726882568.05472: Evaluated conditional (ansible_distribution_major_version != '6'): True 19110 1726882568.05491: variable 'omit' from source: magic vars 19110 1726882568.05548: variable 'omit' from source: magic vars 19110 1726882568.05598: variable 'omit' from source: magic vars 19110 1726882568.05644: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19110 1726882568.05688: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19110 1726882568.05720: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19110 1726882568.05742: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19110 1726882568.05759: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19110 1726882568.05802: variable 'inventory_hostname' from source: host vars for 'managed_node1' 19110 1726882568.05817: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882568.05826: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node1' 19110 1726882568.06283: Set connection var ansible_timeout to 10 19110 1726882568.06301: Set connection var ansible_module_compression to ZIP_DEFLATED 19110 1726882568.06311: Set connection var ansible_shell_executable to /bin/sh 19110 1726882568.06317: Set connection var ansible_shell_type to sh 19110 1726882568.06333: Set connection var ansible_connection to ssh 19110 1726882568.06344: Set connection var ansible_pipelining to False 19110 1726882568.06373: variable 'ansible_shell_executable' from source: unknown 19110 1726882568.06381: variable 'ansible_connection' from source: unknown 19110 1726882568.06388: variable 'ansible_module_compression' from source: unknown 19110 1726882568.06395: variable 'ansible_shell_type' from source: unknown 19110 1726882568.06401: variable 'ansible_shell_executable' from source: unknown 19110 1726882568.06407: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882568.06414: variable 'ansible_pipelining' from source: unknown 19110 1726882568.06420: variable 'ansible_timeout' from source: unknown 19110 1726882568.06426: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882568.06747: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 19110 1726882568.06888: variable 'omit' from source: magic vars 19110 1726882568.06899: starting attempt loop 19110 1726882568.06906: running the handler 19110 1726882568.06923: _low_level_execute_command(): starting 19110 1726882568.06936: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 19110 1726882568.08505: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882568.08533: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
<<< 19110 1726882568.08551: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882568.08576: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882568.08622: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882568.08644: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882568.08661: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882568.08685: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882568.08698: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882568.08709: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882568.08722: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882568.08740: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882568.08766: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882568.08781: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882568.08793: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882568.08807: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882568.08895: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882568.08920: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882568.08937: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882568.09069: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 
1726882568.10685: stdout chunk (state=3): >>>/root <<< 19110 1726882568.11031: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882568.11112: stderr chunk (state=3): >>><<< 19110 1726882568.11247: stdout chunk (state=3): >>><<< 19110 1726882568.11365: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 1726882568.11369: _low_level_execute_command(): starting 19110 1726882568.11372: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882568.1127632-20234-158042278374192 `" && echo ansible-tmp-1726882568.1127632-20234-158042278374192="` echo /root/.ansible/tmp/ansible-tmp-1726882568.1127632-20234-158042278374192 `" ) && sleep 0' 19110 1726882568.12670: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 
3.2.2 4 Jun 2024 <<< 19110 1726882568.12781: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882568.12805: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882568.12823: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882568.12866: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882568.12878: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882568.12893: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882568.12916: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882568.12928: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882568.12937: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882568.12948: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882568.12960: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882568.12980: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882568.13016: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882568.13031: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882568.13045: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882568.13121: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882568.13251: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882568.13271: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 
<<< 19110 1726882568.13397: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882568.15256: stdout chunk (state=3): >>>ansible-tmp-1726882568.1127632-20234-158042278374192=/root/.ansible/tmp/ansible-tmp-1726882568.1127632-20234-158042278374192 <<< 19110 1726882568.15379: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882568.15458: stderr chunk (state=3): >>><<< 19110 1726882568.15461: stdout chunk (state=3): >>><<< 19110 1726882568.15626: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882568.1127632-20234-158042278374192=/root/.ansible/tmp/ansible-tmp-1726882568.1127632-20234-158042278374192 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 1726882568.15630: variable 'ansible_module_compression' from source: unknown 19110 1726882568.15636: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-191108pnkimox/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 19110 1726882568.15723: variable 'ansible_facts' from source: unknown 19110 1726882568.15914: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882568.1127632-20234-158042278374192/AnsiballZ_package_facts.py 19110 1726882568.16153: Sending initial data 19110 1726882568.16156: Sent initial data (162 bytes) 19110 1726882568.18589: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882568.18683: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882568.18712: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882568.18827: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882568.18877: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882568.18890: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882568.18907: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882568.18931: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882568.18951: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882568.18970: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882568.18983: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882568.18995: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882568.19010: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882568.19026: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 
<<< 19110 1726882568.19037: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882568.19055: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882568.19136: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882568.19270: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882568.19289: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882568.19412: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882568.21144: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 19110 1726882568.21235: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 19110 1726882568.21335: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-191108pnkimox/tmpwp4g6blb /root/.ansible/tmp/ansible-tmp-1726882568.1127632-20234-158042278374192/AnsiballZ_package_facts.py <<< 19110 1726882568.21429: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 19110 1726882568.24884: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882568.25006: stderr chunk (state=3): >>><<< 19110 1726882568.25009: stdout chunk (state=3): >>><<< 19110 1726882568.25011: done 
transferring module to remote 19110 1726882568.25018: _low_level_execute_command(): starting 19110 1726882568.25020: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882568.1127632-20234-158042278374192/ /root/.ansible/tmp/ansible-tmp-1726882568.1127632-20234-158042278374192/AnsiballZ_package_facts.py && sleep 0' 19110 1726882568.26446: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882568.26459: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882568.26475: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882568.26492: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882568.26531: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882568.26546: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882568.26559: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882568.26579: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882568.26590: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882568.26599: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882568.26609: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882568.26621: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882568.26634: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882568.26651: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882568.26665: stderr chunk (state=3): >>>debug2: 
match found <<< 19110 1726882568.26680: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882568.26749: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882568.26887: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882568.26903: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882568.27022: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882568.28892: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882568.28896: stdout chunk (state=3): >>><<< 19110 1726882568.28898: stderr chunk (state=3): >>><<< 19110 1726882568.28931: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 1726882568.28964: 
_low_level_execute_command(): starting 19110 1726882568.28975: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882568.1127632-20234-158042278374192/AnsiballZ_package_facts.py && sleep 0' 19110 1726882568.30544: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882568.30576: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882568.30607: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882568.30625: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882568.30678: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882568.30701: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882568.30715: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882568.30732: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882568.30749: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882568.30774: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882568.30777: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882568.30797: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882568.30810: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882568.30836: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882568.30855: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882568.30868: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882568.30994: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882568.31027: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882568.31030: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882568.31200: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882568.76830: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "subscription-manager-rhsm-certificates": [{"name": "subscription-manager-rhsm-certificates", "version": "20220623", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": 
[{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": nu<<< 19110 1726882568.76872: stdout chunk (state=3): >>>ll, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", 
"release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dmidecode": [{"name": "dmidecode", "version": "3.6", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", 
"version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", 
"version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source"<<< 19110 1726882568.76897: stdout chunk (state=3): >>>: "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": 
"1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": 
"5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release<<< 19110 1726882568.76901: stdout chunk (state=3): >>>": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-iniparse": [{"name": "python3-iniparse", "version": "0.4", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-inotify": [{"name": "python3-inotify", "version": "0.9.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": 
"python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-decorator": [{"name": "python3-decorator", "version": "4.4.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-cloud-what": [{"name": "python3-cloud-what", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": 
"508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]<<< 19110 1726882568.76928: stdout chunk (state=3): >>>, "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": 
"rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "virt-what": [{"name": "virt-what", "version": "1.25", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": 
"selinux-policy-targeted", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": 
[{"name": "systemd-udev", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": 
"2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "usermode": [{"name": "usermode", "version": "1.114", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-<<< 19110 1726882568.76947: stdout chunk (state=3): >>>base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf-plugin-subscription-manager": [{"name": "libdnf-plugin-subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-librepo": [{"name": "python3-librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": 
"4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-subscription-manager-rhsm": [{"name": "python3-subscription-manager-rhsm", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "subscription-manager": [{"name": "subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "a<<< 19110 1726882568.76965: stdout chunk (state=3): >>>rch": "noarch", "source": "rpm"}], "policycoreutils-python-utils": [{"name": "policycoreutils-python-utils", "version": 
"3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "rhc": [{"name": "rhc", "version": "0.2.4", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": 
"17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "sour<<< 19110 
1726882568.76993: stdout chunk (state=3): >>>ce": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", 
"release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, <<< 19110 1726882568.76998: stdout chunk (state=3): >>>"arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": 
"nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": 
"perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300<<< 19110 1726882568.77014: stdout chunk (state=3): >>>", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": 
[{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", 
"source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64"<<< 19110 1726882568.77029: stdout chunk (state=3): >>>, "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": 
[{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": 
"libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_6<<< 19110 1726882568.77035: stdout chunk (state=3): >>>4", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": 
"emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", <<< 19110 1726882568.77067: stdout chunk (state=3): >>>"release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": 
"python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch<<< 19110 1726882568.77074: stdout chunk (state=3): >>>", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", 
"version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 19110 1726882568.78568: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
<<< 19110 1726882568.78572: stdout chunk (state=3): >>><<< 19110 1726882568.78579: stderr chunk (state=3): >>><<< 19110 1726882568.78636: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "subscription-manager-rhsm-certificates": [{"name": "subscription-manager-rhsm-certificates", "version": "20220623", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": 
[{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": 
[{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", 
"release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": 
"0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dmidecode": [{"name": "dmidecode", "version": "3.6", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": 
"libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "3.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": 
"cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", 
"version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-iniparse": [{"name": "python3-iniparse", "version": "0.4", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-inotify": [{"name": "python3-inotify", "version": "0.9.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "python3-decorator": [{"name": "python3-decorator", "version": "4.4.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-cloud-what": [{"name": "python3-cloud-what", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", 
"release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "virt-what": [{"name": "virt-what", "version": "1.25", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", 
"version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": 
[{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", 
"release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": 
"elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "usermode": [{"name": "usermode", "version": "1.114", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": 
"2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf-plugin-subscription-manager": [{"name": "libdnf-plugin-subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-librepo": [{"name": "python3-librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-subscription-manager-rhsm": [{"name": "python3-subscription-manager-rhsm", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "subscription-manager": [{"name": "subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", 
"release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "policycoreutils-python-utils": [{"name": "policycoreutils-python-utils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": 
"liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "rhc": [{"name": "rhc", "version": "0.2.4", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": 
"146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": 
[{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": 
"perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", 
"source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", 
"version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, 
"arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": 
"python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": 
[{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
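The `package_facts` payload above keys each package name to a list of dicts with `name`, `version`, `release`, `epoch`, `arch`, and `source` fields (a name can map to several entries, as `gpg-pubkey` does). A minimal sketch of querying such a structure once the JSON has been parsed; the sample entries are copied from the output above, and the helper name is hypothetical:

```python
import json

# A small excerpt of the ansible_facts.packages structure seen in the log above.
packages_json = '''
{
  "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9",
              "epoch": null, "arch": "x86_64", "source": "rpm"}],
  "gpg-pubkey": [
    {"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb",
     "epoch": null, "arch": null, "source": "rpm"},
    {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19",
     "epoch": null, "arch": null, "source": "rpm"}
  ]
}
'''

def installed_versions(packages: dict, name: str) -> list:
    """Hypothetical helper: every version string recorded for a package name."""
    return [entry["version"] for entry in packages.get(name, [])]

packages = json.loads(packages_json)
print(installed_versions(packages, "chrony"))            # ['4.6']
print(len(installed_versions(packages, "gpg-pubkey")))   # 2
print(installed_versions(packages, "not-installed"))     # []
```

In a playbook the same lookup would typically be done against `ansible_facts.packages` in Jinja2 rather than in Python; the sketch only illustrates the shape of the data.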
19110 1726882568.81089: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882568.1127632-20234-158042278374192/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 19110 1726882568.81112: _low_level_execute_command(): starting 19110 1726882568.81115: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882568.1127632-20234-158042278374192/ > /dev/null 2>&1 && sleep 0' 19110 1726882568.81755: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882568.81772: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882568.81778: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882568.81793: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882568.81831: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882568.81838: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882568.81849: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882568.81871: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882568.81883: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address 
<<< 19110 1726882568.81886: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882568.81892: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882568.81901: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882568.81913: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882568.81920: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882568.81927: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882568.81936: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882568.82013: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882568.82028: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882568.82037: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882568.82159: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882568.83968: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882568.84006: stderr chunk (state=3): >>><<< 19110 1726882568.84009: stdout chunk (state=3): >>><<< 19110 1726882568.84021: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 1726882568.84029: handler run complete 19110 1726882568.84585: variable 'ansible_facts' from source: unknown 19110 1726882568.84892: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882568.86762: variable 'ansible_facts' from source: unknown 19110 1726882568.87050: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882568.87510: attempt loop complete, returning result 19110 1726882568.87521: _execute() done 19110 1726882568.87524: dumping result to json 19110 1726882568.87654: done dumping result, returning 19110 1726882568.87665: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which packages are installed [0e448fcc-3ce9-5372-c19a-000000000372] 19110 1726882568.87671: sending task result for task 0e448fcc-3ce9-5372-c19a-000000000372 19110 1726882568.90437: done sending task result for task 0e448fcc-3ce9-5372-c19a-000000000372 19110 1726882568.90441: WORKER PROCESS EXITING ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 19110 1726882568.90700: no more pending results, returning what we have 19110 1726882568.90703: results queue empty 19110 1726882568.90704: checking for 
any_errors_fatal 19110 1726882568.90712: done checking for any_errors_fatal 19110 1726882568.90713: checking for max_fail_percentage 19110 1726882568.90714: done checking for max_fail_percentage 19110 1726882568.90716: checking to see if all hosts have failed and the running result is not ok 19110 1726882568.90717: done checking to see if all hosts have failed 19110 1726882568.90717: getting the remaining hosts for this loop 19110 1726882568.90719: done getting the remaining hosts for this loop 19110 1726882568.90723: getting the next task for host managed_node1 19110 1726882568.90731: done getting next task for host managed_node1 19110 1726882568.90734: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 19110 1726882568.90737: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19110 1726882568.90748: getting variables 19110 1726882568.90750: in VariableManager get_vars() 19110 1726882568.90797: Calling all_inventory to load vars for managed_node1 19110 1726882568.90801: Calling groups_inventory to load vars for managed_node1 19110 1726882568.90803: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882568.90814: Calling all_plugins_play to load vars for managed_node1 19110 1726882568.90817: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882568.90820: Calling groups_plugins_play to load vars for managed_node1 19110 1726882568.92355: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882568.94372: done with get_vars() 19110 1726882568.94394: done getting variables 19110 1726882568.94453: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 21:36:08 -0400 (0:00:00.904) 0:00:25.802 ****** 19110 1726882568.94499: entering _queue_task() for managed_node1/debug 19110 1726882568.94824: worker is 1 (out of 1 available) 19110 1726882568.94837: exiting _queue_task() for managed_node1/debug 19110 1726882568.94850: done queuing things up, now waiting for results queue to drain 19110 1726882568.94852: waiting for pending results... 
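For reference, the debug task being queued here (task path tasks/main.yml:7) is plausibly shaped like the following sketch, reconstructed from the conditional and message visible in this log; the exact wording and layout in the role may differ:

```yaml
# Sketch inferred from this log: the message template matches the
# "Using network provider: nm" output below, and network_provider is
# reported as coming from set_fact. The EL6 guard is shown here as a
# task-level condition; the real role may express it differently.
- name: Print network provider
  debug:
    msg: "Using network provider: {{ network_provider }}"
  when: ansible_distribution_major_version != '6'
```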
19110 1726882568.95161: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider 19110 1726882568.95277: in run() - task 0e448fcc-3ce9-5372-c19a-00000000003d 19110 1726882568.95302: variable 'ansible_search_path' from source: unknown 19110 1726882568.95308: variable 'ansible_search_path' from source: unknown 19110 1726882568.95357: calling self._execute() 19110 1726882568.95470: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882568.95483: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882568.95496: variable 'omit' from source: magic vars 19110 1726882568.95919: variable 'ansible_distribution_major_version' from source: facts 19110 1726882568.95938: Evaluated conditional (ansible_distribution_major_version != '6'): True 19110 1726882568.95954: variable 'omit' from source: magic vars 19110 1726882568.96009: variable 'omit' from source: magic vars 19110 1726882568.96124: variable 'network_provider' from source: set_fact 19110 1726882568.96145: variable 'omit' from source: magic vars 19110 1726882568.96210: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19110 1726882568.96253: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19110 1726882568.96284: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19110 1726882568.96308: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19110 1726882568.96332: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19110 1726882568.96370: variable 'inventory_hostname' from source: host vars for 'managed_node1' 19110 1726882568.96383: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 
1726882568.96390: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882568.96510: Set connection var ansible_timeout to 10 19110 1726882568.96534: Set connection var ansible_module_compression to ZIP_DEFLATED 19110 1726882568.96552: Set connection var ansible_shell_executable to /bin/sh 19110 1726882568.96560: Set connection var ansible_shell_type to sh 19110 1726882568.96570: Set connection var ansible_connection to ssh 19110 1726882568.96581: Set connection var ansible_pipelining to False 19110 1726882568.96612: variable 'ansible_shell_executable' from source: unknown 19110 1726882568.96620: variable 'ansible_connection' from source: unknown 19110 1726882568.96628: variable 'ansible_module_compression' from source: unknown 19110 1726882568.96640: variable 'ansible_shell_type' from source: unknown 19110 1726882568.96655: variable 'ansible_shell_executable' from source: unknown 19110 1726882568.96666: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882568.96675: variable 'ansible_pipelining' from source: unknown 19110 1726882568.96682: variable 'ansible_timeout' from source: unknown 19110 1726882568.96689: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882568.96850: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 19110 1726882568.96879: variable 'omit' from source: magic vars 19110 1726882568.96889: starting attempt loop 19110 1726882568.96897: running the handler 19110 1726882568.96951: handler run complete 19110 1726882568.96983: attempt loop complete, returning result 19110 1726882568.96991: _execute() done 19110 1726882568.96997: dumping result to json 19110 1726882568.97004: done dumping result, returning 
19110 1726882568.97015: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider [0e448fcc-3ce9-5372-c19a-00000000003d] 19110 1726882568.97027: sending task result for task 0e448fcc-3ce9-5372-c19a-00000000003d 19110 1726882568.97138: done sending task result for task 0e448fcc-3ce9-5372-c19a-00000000003d ok: [managed_node1] => {} MSG: Using network provider: nm 19110 1726882568.97200: no more pending results, returning what we have 19110 1726882568.97204: results queue empty 19110 1726882568.97205: checking for any_errors_fatal 19110 1726882568.97215: done checking for any_errors_fatal 19110 1726882568.97216: checking for max_fail_percentage 19110 1726882568.97217: done checking for max_fail_percentage 19110 1726882568.97218: checking to see if all hosts have failed and the running result is not ok 19110 1726882568.97219: done checking to see if all hosts have failed 19110 1726882568.97220: getting the remaining hosts for this loop 19110 1726882568.97222: done getting the remaining hosts for this loop 19110 1726882568.97225: getting the next task for host managed_node1 19110 1726882568.97234: done getting next task for host managed_node1 19110 1726882568.97237: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 19110 1726882568.97239: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19110 1726882568.97250: getting variables 19110 1726882568.97252: in VariableManager get_vars() 19110 1726882568.97289: Calling all_inventory to load vars for managed_node1 19110 1726882568.97292: Calling groups_inventory to load vars for managed_node1 19110 1726882568.97295: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882568.97305: Calling all_plugins_play to load vars for managed_node1 19110 1726882568.97307: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882568.97310: Calling groups_plugins_play to load vars for managed_node1 19110 1726882568.98316: WORKER PROCESS EXITING 19110 1726882568.99147: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882569.01046: done with get_vars() 19110 1726882569.01081: done getting variables 19110 1726882569.01138: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 21:36:09 -0400 (0:00:00.066) 0:00:25.869 ****** 19110 1726882569.01180: entering _queue_task() for managed_node1/fail 19110 1726882569.01478: worker is 1 (out of 1 available) 19110 1726882569.01500: exiting _queue_task() for managed_node1/fail 19110 1726882569.01511: done queuing things up, now waiting for results queue to drain 19110 1726882569.01513: waiting for pending results... 
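The fail task queued here (task path tasks/main.yml:11) is plausibly shaped like the sketch below. The log only shows the condition that evaluated False (`network_state != {}`); the initscripts check and the msg text are assumptions:

```yaml
# Sketch inferred from this log. Because network_state != {} evaluated
# False, any remaining conditions were never reached in this run.
- name: >-
    Abort applying the network state configuration if using the
    `network_state` variable with the initscripts provider
  fail:
    msg: The network_state variable is not supported with the initscripts provider  # assumed wording
  when:
    - network_state != {}
    - network_provider == "initscripts"  # assumption: not evaluated in this run
```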
19110 1726882569.01805: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 19110 1726882569.01927: in run() - task 0e448fcc-3ce9-5372-c19a-00000000003e 19110 1726882569.01957: variable 'ansible_search_path' from source: unknown 19110 1726882569.01969: variable 'ansible_search_path' from source: unknown 19110 1726882569.02010: calling self._execute() 19110 1726882569.02115: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882569.02128: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882569.02151: variable 'omit' from source: magic vars 19110 1726882569.02548: variable 'ansible_distribution_major_version' from source: facts 19110 1726882569.02566: Evaluated conditional (ansible_distribution_major_version != '6'): True 19110 1726882569.02706: variable 'network_state' from source: role '' defaults 19110 1726882569.02724: Evaluated conditional (network_state != {}): False 19110 1726882569.02731: when evaluation is False, skipping this task 19110 1726882569.02739: _execute() done 19110 1726882569.02745: dumping result to json 19110 1726882569.02753: done dumping result, returning 19110 1726882569.02765: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0e448fcc-3ce9-5372-c19a-00000000003e] 19110 1726882569.02776: sending task result for task 0e448fcc-3ce9-5372-c19a-00000000003e skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 19110 1726882569.02917: no more pending results, returning what we have 19110 1726882569.02922: results queue empty 19110 1726882569.02923: checking for any_errors_fatal 19110 1726882569.02929: done 
checking for any_errors_fatal 19110 1726882569.02930: checking for max_fail_percentage 19110 1726882569.02932: done checking for max_fail_percentage 19110 1726882569.02933: checking to see if all hosts have failed and the running result is not ok 19110 1726882569.02933: done checking to see if all hosts have failed 19110 1726882569.02934: getting the remaining hosts for this loop 19110 1726882569.02936: done getting the remaining hosts for this loop 19110 1726882569.02939: getting the next task for host managed_node1 19110 1726882569.02946: done getting next task for host managed_node1 19110 1726882569.02949: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 19110 1726882569.02952: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19110 1726882569.02968: getting variables 19110 1726882569.02970: in VariableManager get_vars() 19110 1726882569.03008: Calling all_inventory to load vars for managed_node1 19110 1726882569.03012: Calling groups_inventory to load vars for managed_node1 19110 1726882569.03014: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882569.03026: Calling all_plugins_play to load vars for managed_node1 19110 1726882569.03028: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882569.03031: Calling groups_plugins_play to load vars for managed_node1 19110 1726882569.04124: done sending task result for task 0e448fcc-3ce9-5372-c19a-00000000003e 19110 1726882569.04127: WORKER PROCESS EXITING 19110 1726882569.04967: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882569.06125: done with get_vars() 19110 1726882569.06140: done getting variables 19110 1726882569.06184: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 21:36:09 -0400 (0:00:00.050) 0:00:25.919 ****** 19110 1726882569.06205: entering _queue_task() for managed_node1/fail 19110 1726882569.06411: worker is 1 (out of 1 available) 19110 1726882569.06424: exiting _queue_task() for managed_node1/fail 19110 1726882569.06435: done queuing things up, now waiting for results queue to drain 19110 1726882569.06437: waiting for pending results... 
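The version-guard fail task queued here (task path tasks/main.yml:18) likely follows the same pattern. Only `network_state != {}` was evaluated (and short-circuited the `when` list to False); the version comparison and msg text are assumptions:

```yaml
# Sketch inferred from this log; the < 8 comparison mirrors the task
# name ("below 8") and is an assumption, as is the message.
- name: >-
    Abort applying the network state configuration if the system version
    of the managed host is below 8
  fail:
    msg: Applying network_state requires a managed host of version 8 or later  # assumed wording
  when:
    - network_state != {}
    - ansible_distribution_major_version | int < 8  # assumption: not evaluated in this run
```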
19110 1726882569.06611: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 19110 1726882569.06679: in run() - task 0e448fcc-3ce9-5372-c19a-00000000003f 19110 1726882569.06690: variable 'ansible_search_path' from source: unknown 19110 1726882569.06694: variable 'ansible_search_path' from source: unknown 19110 1726882569.06722: calling self._execute() 19110 1726882569.06792: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882569.06796: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882569.06804: variable 'omit' from source: magic vars 19110 1726882569.07065: variable 'ansible_distribution_major_version' from source: facts 19110 1726882569.07076: Evaluated conditional (ansible_distribution_major_version != '6'): True 19110 1726882569.07158: variable 'network_state' from source: role '' defaults 19110 1726882569.07166: Evaluated conditional (network_state != {}): False 19110 1726882569.07168: when evaluation is False, skipping this task 19110 1726882569.07171: _execute() done 19110 1726882569.07173: dumping result to json 19110 1726882569.07177: done dumping result, returning 19110 1726882569.07184: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0e448fcc-3ce9-5372-c19a-00000000003f] 19110 1726882569.07191: sending task result for task 0e448fcc-3ce9-5372-c19a-00000000003f 19110 1726882569.07282: done sending task result for task 0e448fcc-3ce9-5372-c19a-00000000003f 19110 1726882569.07285: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 19110 1726882569.07349: no more pending results, returning what we have 19110 
1726882569.07352: results queue empty 19110 1726882569.07353: checking for any_errors_fatal 19110 1726882569.07360: done checking for any_errors_fatal 19110 1726882569.07361: checking for max_fail_percentage 19110 1726882569.07362: done checking for max_fail_percentage 19110 1726882569.07363: checking to see if all hosts have failed and the running result is not ok 19110 1726882569.07365: done checking to see if all hosts have failed 19110 1726882569.07366: getting the remaining hosts for this loop 19110 1726882569.07367: done getting the remaining hosts for this loop 19110 1726882569.07370: getting the next task for host managed_node1 19110 1726882569.07374: done getting next task for host managed_node1 19110 1726882569.07378: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 19110 1726882569.07380: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19110 1726882569.07391: getting variables 19110 1726882569.07392: in VariableManager get_vars() 19110 1726882569.07418: Calling all_inventory to load vars for managed_node1 19110 1726882569.07420: Calling groups_inventory to load vars for managed_node1 19110 1726882569.07421: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882569.07431: Calling all_plugins_play to load vars for managed_node1 19110 1726882569.07433: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882569.07435: Calling groups_plugins_play to load vars for managed_node1 19110 1726882569.08297: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882569.09802: done with get_vars() 19110 1726882569.09822: done getting variables 19110 1726882569.09890: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 21:36:09 -0400 (0:00:00.037) 0:00:25.956 ****** 19110 1726882569.09916: entering _queue_task() for managed_node1/fail 19110 1726882569.10163: worker is 1 (out of 1 available) 19110 1726882569.10177: exiting _queue_task() for managed_node1/fail 19110 1726882569.10190: done queuing things up, now waiting for results queue to drain 19110 1726882569.10192: waiting for pending results... 
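The teaming guard queued here (task path tasks/main.yml:25) can be sketched as follows. The log records the failed condition `ansible_distribution_major_version | int > 9`; the team-interface check and msg text are assumptions:

```yaml
# Sketch inferred from this log. On this EL9 host the > 9 condition
# evaluates False and the task is skipped, matching the result below.
- name: >-
    Abort applying teaming configuration if the system version of the
    managed host is EL10 or later
  fail:
    msg: Team interfaces are not supported on EL10 or later  # assumed wording
  when:
    - ansible_distribution_major_version | int > 9
    - "'team' in (network_connections | map(attribute='type') | list)"  # assumption: not evaluated in this run
```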
19110 1726882569.10472: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 19110 1726882569.10562: in run() - task 0e448fcc-3ce9-5372-c19a-000000000040 19110 1726882569.10588: variable 'ansible_search_path' from source: unknown 19110 1726882569.10592: variable 'ansible_search_path' from source: unknown 19110 1726882569.10614: calling self._execute() 19110 1726882569.10710: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882569.10714: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882569.10722: variable 'omit' from source: magic vars 19110 1726882569.11096: variable 'ansible_distribution_major_version' from source: facts 19110 1726882569.11105: Evaluated conditional (ansible_distribution_major_version != '6'): True 19110 1726882569.11229: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 19110 1726882569.13013: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 19110 1726882569.13069: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 19110 1726882569.13095: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 19110 1726882569.13120: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 19110 1726882569.13144: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 19110 1726882569.13199: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19110 1726882569.13218: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19110 1726882569.13238: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19110 1726882569.13270: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19110 1726882569.13281: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19110 1726882569.13342: variable 'ansible_distribution_major_version' from source: facts 19110 1726882569.13355: Evaluated conditional (ansible_distribution_major_version | int > 9): False 19110 1726882569.13358: when evaluation is False, skipping this task 19110 1726882569.13360: _execute() done 19110 1726882569.13363: dumping result to json 19110 1726882569.13373: done dumping result, returning 19110 1726882569.13376: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0e448fcc-3ce9-5372-c19a-000000000040] 19110 1726882569.13381: sending task result for task 0e448fcc-3ce9-5372-c19a-000000000040 19110 1726882569.13455: done sending task result for task 0e448fcc-3ce9-5372-c19a-000000000040 19110 1726882569.13458: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version | int > 9", "skip_reason": "Conditional result was False" } 19110 1726882569.13553: no more pending results, returning what we have 19110 1726882569.13557: 
results queue empty 19110 1726882569.13558: checking for any_errors_fatal 19110 1726882569.13565: done checking for any_errors_fatal 19110 1726882569.13566: checking for max_fail_percentage 19110 1726882569.13567: done checking for max_fail_percentage 19110 1726882569.13568: checking to see if all hosts have failed and the running result is not ok 19110 1726882569.13569: done checking to see if all hosts have failed 19110 1726882569.13570: getting the remaining hosts for this loop 19110 1726882569.13571: done getting the remaining hosts for this loop 19110 1726882569.13574: getting the next task for host managed_node1 19110 1726882569.13579: done getting next task for host managed_node1 19110 1726882569.13583: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 19110 1726882569.13585: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19110 1726882569.13597: getting variables 19110 1726882569.13599: in VariableManager get_vars() 19110 1726882569.13627: Calling all_inventory to load vars for managed_node1 19110 1726882569.13632: Calling groups_inventory to load vars for managed_node1 19110 1726882569.13634: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882569.13642: Calling all_plugins_play to load vars for managed_node1 19110 1726882569.13645: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882569.13648: Calling groups_plugins_play to load vars for managed_node1 19110 1726882569.15038: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882569.15997: done with get_vars() 19110 1726882569.16011: done getting variables 19110 1726882569.16048: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 21:36:09 -0400 (0:00:00.061) 0:00:26.017 ****** 19110 1726882569.16073: entering _queue_task() for managed_node1/dnf 19110 1726882569.16251: worker is 1 (out of 1 available) 19110 1726882569.16268: exiting _queue_task() for managed_node1/dnf 19110 1726882569.16280: done queuing things up, now waiting for results queue to drain 19110 1726882569.16282: waiting for pending results... 
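The dnf check queued here (task path tasks/main.yml:36) is plausibly shaped like the sketch below. Both `when` conditions are taken verbatim from the evaluations in this log; the package-list variable name and the check_mode usage are assumptions:

```yaml
# Sketch inferred from this log. The first condition evaluated True and
# the second False (no wireless or team connections defined), so the
# task is skipped, matching the result below.
- name: >-
    Check if updates for network packages are available through the DNF
    package manager due to wireless or team interfaces
  dnf:
    name: "{{ network_packages }}"  # assumption: actual variable name may differ
    state: latest
  check_mode: true  # assumption
  when:
    - ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7
    - __network_wireless_connections_defined or __network_team_connections_defined
```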
19110 1726882569.16452: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 19110 1726882569.16526: in run() - task 0e448fcc-3ce9-5372-c19a-000000000041 19110 1726882569.16537: variable 'ansible_search_path' from source: unknown 19110 1726882569.16540: variable 'ansible_search_path' from source: unknown 19110 1726882569.16572: calling self._execute() 19110 1726882569.16644: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882569.16648: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882569.16655: variable 'omit' from source: magic vars 19110 1726882569.16923: variable 'ansible_distribution_major_version' from source: facts 19110 1726882569.16932: Evaluated conditional (ansible_distribution_major_version != '6'): True 19110 1726882569.17075: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 19110 1726882569.19155: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 19110 1726882569.19232: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 19110 1726882569.19286: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 19110 1726882569.19320: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 19110 1726882569.19352: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 19110 1726882569.19427: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19110 1726882569.19462: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19110 1726882569.19496: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19110 1726882569.19524: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19110 1726882569.19535: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19110 1726882569.19615: variable 'ansible_distribution' from source: facts 19110 1726882569.19620: variable 'ansible_distribution_major_version' from source: facts 19110 1726882569.19633: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 19110 1726882569.19707: variable '__network_wireless_connections_defined' from source: role '' defaults 19110 1726882569.19811: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19110 1726882569.19834: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19110 1726882569.19869: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19110 1726882569.19926: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19110 1726882569.19945: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19110 1726882569.20005: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19110 1726882569.20028: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19110 1726882569.20051: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19110 1726882569.20090: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19110 1726882569.20113: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19110 1726882569.20172: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19110 1726882569.20208: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19110 
1726882569.20245: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19110 1726882569.20310: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19110 1726882569.20332: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19110 1726882569.20461: variable 'network_connections' from source: play vars 19110 1726882569.20487: variable 'profile' from source: play vars 19110 1726882569.20559: variable 'profile' from source: play vars 19110 1726882569.20572: variable 'interface' from source: set_fact 19110 1726882569.20659: variable 'interface' from source: set_fact 19110 1726882569.20774: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 19110 1726882569.21021: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 19110 1726882569.21087: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 19110 1726882569.21114: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 19110 1726882569.21140: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 19110 1726882569.21223: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 19110 1726882569.21275: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 19110 1726882569.21314: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 19110 1726882569.21376: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 19110 1726882569.21490: variable '__network_team_connections_defined' from source: role '' defaults 19110 1726882569.21885: variable 'network_connections' from source: play vars 19110 1726882569.21888: variable 'profile' from source: play vars 19110 1726882569.21961: variable 'profile' from source: play vars 19110 1726882569.21993: variable 'interface' from source: set_fact 19110 1726882569.22077: variable 'interface' from source: set_fact 19110 1726882569.22095: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 19110 1726882569.22101: when evaluation is False, skipping this task 19110 1726882569.22106: _execute() done 19110 1726882569.22120: dumping result to json 19110 1726882569.22123: done dumping result, returning 19110 1726882569.22137: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0e448fcc-3ce9-5372-c19a-000000000041] 19110 1726882569.22140: sending task result for task 0e448fcc-3ce9-5372-c19a-000000000041 19110 1726882569.22221: done sending task result for task 0e448fcc-3ce9-5372-c19a-000000000041 19110 1726882569.22224: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": 
"Conditional result was False" } 19110 1726882569.22289: no more pending results, returning what we have 19110 1726882569.22293: results queue empty 19110 1726882569.22296: checking for any_errors_fatal 19110 1726882569.22302: done checking for any_errors_fatal 19110 1726882569.22303: checking for max_fail_percentage 19110 1726882569.22305: done checking for max_fail_percentage 19110 1726882569.22305: checking to see if all hosts have failed and the running result is not ok 19110 1726882569.22306: done checking to see if all hosts have failed 19110 1726882569.22309: getting the remaining hosts for this loop 19110 1726882569.22310: done getting the remaining hosts for this loop 19110 1726882569.22313: getting the next task for host managed_node1 19110 1726882569.22319: done getting next task for host managed_node1 19110 1726882569.22323: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 19110 1726882569.22325: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19110 1726882569.22338: getting variables 19110 1726882569.22339: in VariableManager get_vars() 19110 1726882569.22381: Calling all_inventory to load vars for managed_node1 19110 1726882569.22387: Calling groups_inventory to load vars for managed_node1 19110 1726882569.22390: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882569.22400: Calling all_plugins_play to load vars for managed_node1 19110 1726882569.22402: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882569.22405: Calling groups_plugins_play to load vars for managed_node1 19110 1726882569.27334: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882569.28298: done with get_vars() 19110 1726882569.28313: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 19110 1726882569.28352: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 21:36:09 -0400 (0:00:00.122) 0:00:26.141 ****** 19110 1726882569.28373: entering _queue_task() for managed_node1/yum 19110 1726882569.28596: worker is 1 (out of 1 available) 19110 1726882569.28609: exiting _queue_task() for managed_node1/yum 19110 1726882569.28620: done queuing things up, now waiting for results queue to drain 19110 1726882569.28622: waiting for pending results... 
19110 1726882569.28803: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 19110 1726882569.28882: in run() - task 0e448fcc-3ce9-5372-c19a-000000000042 19110 1726882569.28892: variable 'ansible_search_path' from source: unknown 19110 1726882569.28896: variable 'ansible_search_path' from source: unknown 19110 1726882569.28926: calling self._execute() 19110 1726882569.28997: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882569.29002: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882569.29009: variable 'omit' from source: magic vars 19110 1726882569.29444: variable 'ansible_distribution_major_version' from source: facts 19110 1726882569.29467: Evaluated conditional (ansible_distribution_major_version != '6'): True 19110 1726882569.29654: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 19110 1726882569.31936: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 19110 1726882569.32024: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 19110 1726882569.32057: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 19110 1726882569.32121: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 19110 1726882569.32141: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 19110 1726882569.32220: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19110 1726882569.32253: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19110 1726882569.32298: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19110 1726882569.32340: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19110 1726882569.32350: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19110 1726882569.32434: variable 'ansible_distribution_major_version' from source: facts 19110 1726882569.32448: Evaluated conditional (ansible_distribution_major_version | int < 8): False 19110 1726882569.32451: when evaluation is False, skipping this task 19110 1726882569.32454: _execute() done 19110 1726882569.32456: dumping result to json 19110 1726882569.32461: done dumping result, returning 19110 1726882569.32470: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0e448fcc-3ce9-5372-c19a-000000000042] 19110 1726882569.32476: sending task result for task 0e448fcc-3ce9-5372-c19a-000000000042 19110 1726882569.32561: done sending task result for task 0e448fcc-3ce9-5372-c19a-000000000042 19110 1726882569.32571: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 19110 1726882569.32634: no more pending results, returning 
what we have 19110 1726882569.32638: results queue empty 19110 1726882569.32639: checking for any_errors_fatal 19110 1726882569.32645: done checking for any_errors_fatal 19110 1726882569.32646: checking for max_fail_percentage 19110 1726882569.32648: done checking for max_fail_percentage 19110 1726882569.32649: checking to see if all hosts have failed and the running result is not ok 19110 1726882569.32650: done checking to see if all hosts have failed 19110 1726882569.32651: getting the remaining hosts for this loop 19110 1726882569.32653: done getting the remaining hosts for this loop 19110 1726882569.32656: getting the next task for host managed_node1 19110 1726882569.32661: done getting next task for host managed_node1 19110 1726882569.32667: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 19110 1726882569.32669: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19110 1726882569.32688: getting variables 19110 1726882569.32690: in VariableManager get_vars() 19110 1726882569.32727: Calling all_inventory to load vars for managed_node1 19110 1726882569.32730: Calling groups_inventory to load vars for managed_node1 19110 1726882569.32732: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882569.32771: Calling all_plugins_play to load vars for managed_node1 19110 1726882569.32781: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882569.32791: Calling groups_plugins_play to load vars for managed_node1 19110 1726882569.34142: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882569.35168: done with get_vars() 19110 1726882569.35183: done getting variables 19110 1726882569.35238: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 21:36:09 -0400 (0:00:00.068) 0:00:26.209 ****** 19110 1726882569.35267: entering _queue_task() for managed_node1/fail 19110 1726882569.35486: worker is 1 (out of 1 available) 19110 1726882569.35499: exiting _queue_task() for managed_node1/fail 19110 1726882569.35511: done queuing things up, now waiting for results queue to drain 19110 1726882569.35513: waiting for pending results... 
19110 1726882569.35695: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 19110 1726882569.35765: in run() - task 0e448fcc-3ce9-5372-c19a-000000000043 19110 1726882569.35775: variable 'ansible_search_path' from source: unknown 19110 1726882569.35779: variable 'ansible_search_path' from source: unknown 19110 1726882569.35809: calling self._execute() 19110 1726882569.35885: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882569.35889: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882569.35896: variable 'omit' from source: magic vars 19110 1726882569.36306: variable 'ansible_distribution_major_version' from source: facts 19110 1726882569.36317: Evaluated conditional (ansible_distribution_major_version != '6'): True 19110 1726882569.36449: variable '__network_wireless_connections_defined' from source: role '' defaults 19110 1726882569.36654: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 19110 1726882569.38595: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 19110 1726882569.38661: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 19110 1726882569.38725: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 19110 1726882569.38755: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 19110 1726882569.38775: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 19110 1726882569.38848: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 19110 1726882569.38871: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19110 1726882569.38889: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19110 1726882569.38915: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19110 1726882569.38941: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19110 1726882569.39003: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19110 1726882569.39033: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19110 1726882569.39050: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19110 1726882569.39097: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19110 1726882569.39115: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19110 1726882569.39152: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19110 1726882569.39186: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19110 1726882569.39204: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19110 1726882569.39234: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19110 1726882569.39244: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19110 1726882569.39358: variable 'network_connections' from source: play vars 19110 1726882569.39370: variable 'profile' from source: play vars 19110 1726882569.39421: variable 'profile' from source: play vars 19110 1726882569.39425: variable 'interface' from source: set_fact 19110 1726882569.39472: variable 'interface' from source: set_fact 19110 1726882569.39538: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 19110 1726882569.39645: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 19110 1726882569.39677: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 19110 1726882569.39699: Loading 
TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 19110 1726882569.39731: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 19110 1726882569.39763: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 19110 1726882569.39781: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 19110 1726882569.39808: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 19110 1726882569.39834: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 19110 1726882569.39896: variable '__network_team_connections_defined' from source: role '' defaults 19110 1726882569.40079: variable 'network_connections' from source: play vars 19110 1726882569.40082: variable 'profile' from source: play vars 19110 1726882569.40138: variable 'profile' from source: play vars 19110 1726882569.40141: variable 'interface' from source: set_fact 19110 1726882569.40188: variable 'interface' from source: set_fact 19110 1726882569.40206: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 19110 1726882569.40222: when evaluation is False, skipping this task 19110 1726882569.40226: _execute() done 19110 1726882569.40243: dumping result to json 19110 1726882569.40268: done dumping result, returning 19110 1726882569.40287: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's 
consent to restart NetworkManager due to wireless or team interfaces [0e448fcc-3ce9-5372-c19a-000000000043] 19110 1726882569.40298: sending task result for task 0e448fcc-3ce9-5372-c19a-000000000043 19110 1726882569.40395: done sending task result for task 0e448fcc-3ce9-5372-c19a-000000000043 19110 1726882569.40398: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 19110 1726882569.40547: no more pending results, returning what we have 19110 1726882569.40550: results queue empty 19110 1726882569.40551: checking for any_errors_fatal 19110 1726882569.40555: done checking for any_errors_fatal 19110 1726882569.40556: checking for max_fail_percentage 19110 1726882569.40558: done checking for max_fail_percentage 19110 1726882569.40558: checking to see if all hosts have failed and the running result is not ok 19110 1726882569.40559: done checking to see if all hosts have failed 19110 1726882569.40560: getting the remaining hosts for this loop 19110 1726882569.40561: done getting the remaining hosts for this loop 19110 1726882569.40565: getting the next task for host managed_node1 19110 1726882569.40570: done getting next task for host managed_node1 19110 1726882569.40573: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 19110 1726882569.40575: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19110 1726882569.40612: getting variables 19110 1726882569.40614: in VariableManager get_vars() 19110 1726882569.40639: Calling all_inventory to load vars for managed_node1 19110 1726882569.40641: Calling groups_inventory to load vars for managed_node1 19110 1726882569.40642: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882569.40648: Calling all_plugins_play to load vars for managed_node1 19110 1726882569.40650: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882569.40652: Calling groups_plugins_play to load vars for managed_node1 19110 1726882569.41844: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882569.43036: done with get_vars() 19110 1726882569.43051: done getting variables 19110 1726882569.43095: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 21:36:09 -0400 (0:00:00.078) 0:00:26.288 ****** 19110 1726882569.43130: entering _queue_task() for managed_node1/package 19110 1726882569.43337: worker is 1 (out of 1 available) 19110 1726882569.43352: exiting _queue_task() for managed_node1/package 19110 1726882569.43367: done queuing things up, now waiting for results queue to drain 19110 1726882569.43369: waiting for pending results... 
19110 1726882569.43578: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages 19110 1726882569.43645: in run() - task 0e448fcc-3ce9-5372-c19a-000000000044 19110 1726882569.43656: variable 'ansible_search_path' from source: unknown 19110 1726882569.43662: variable 'ansible_search_path' from source: unknown 19110 1726882569.43698: calling self._execute() 19110 1726882569.43832: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882569.43968: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882569.44060: variable 'omit' from source: magic vars 19110 1726882569.44403: variable 'ansible_distribution_major_version' from source: facts 19110 1726882569.44458: Evaluated conditional (ansible_distribution_major_version != '6'): True 19110 1726882569.44821: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 19110 1726882569.45070: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 19110 1726882569.45128: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 19110 1726882569.45165: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 19110 1726882569.45205: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 19110 1726882569.45291: variable 'network_packages' from source: role '' defaults 19110 1726882569.45358: variable '__network_provider_setup' from source: role '' defaults 19110 1726882569.45370: variable '__network_service_name_default_nm' from source: role '' defaults 19110 1726882569.45420: variable '__network_service_name_default_nm' from source: role '' defaults 19110 1726882569.45427: variable '__network_packages_default_nm' from source: role '' defaults 19110 1726882569.45475: variable 
'__network_packages_default_nm' from source: role '' defaults 19110 1726882569.45697: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 19110 1726882569.47681: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 19110 1726882569.47740: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 19110 1726882569.47774: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 19110 1726882569.47805: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 19110 1726882569.47839: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 19110 1726882569.47909: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19110 1726882569.47936: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19110 1726882569.47960: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19110 1726882569.48001: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19110 1726882569.48015: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19110 
1726882569.48058: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19110 1726882569.48079: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19110 1726882569.48103: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19110 1726882569.48142: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19110 1726882569.48159: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19110 1726882569.48367: variable '__network_packages_default_gobject_packages' from source: role '' defaults 19110 1726882569.48465: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19110 1726882569.48488: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19110 1726882569.48511: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19110 1726882569.48550: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19110 1726882569.48562: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19110 1726882569.48647: variable 'ansible_python' from source: facts 19110 1726882569.48678: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 19110 1726882569.48747: variable '__network_wpa_supplicant_required' from source: role '' defaults 19110 1726882569.48823: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 19110 1726882569.48941: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19110 1726882569.48967: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19110 1726882569.48991: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19110 1726882569.49032: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19110 1726882569.49052: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19110 1726882569.49093: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19110 1726882569.49109: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19110 1726882569.49137: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19110 1726882569.49176: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19110 1726882569.49198: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19110 1726882569.49326: variable 'network_connections' from source: play vars 19110 1726882569.49330: variable 'profile' from source: play vars 19110 1726882569.49431: variable 'profile' from source: play vars 19110 1726882569.49435: variable 'interface' from source: set_fact 19110 1726882569.49490: variable 'interface' from source: set_fact 19110 1726882569.49560: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 19110 1726882569.49582: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 19110 1726882569.49602: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 19110 1726882569.49623: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 19110 1726882569.49664: variable '__network_wireless_connections_defined' from source: role '' defaults 19110 1726882569.49840: variable 'network_connections' from source: play vars 19110 1726882569.49844: variable 'profile' from source: play vars 19110 1726882569.49918: variable 'profile' from source: play vars 19110 1726882569.49924: variable 'interface' from source: set_fact 19110 1726882569.49978: variable 'interface' from source: set_fact 19110 1726882569.50003: variable '__network_packages_default_wireless' from source: role '' defaults 19110 1726882569.50060: variable '__network_wireless_connections_defined' from source: role '' defaults 19110 1726882569.50260: variable 'network_connections' from source: play vars 19110 1726882569.50265: variable 'profile' from source: play vars 19110 1726882569.50310: variable 'profile' from source: play vars 19110 1726882569.50313: variable 'interface' from source: set_fact 19110 1726882569.50384: variable 'interface' from source: set_fact 19110 1726882569.50402: variable '__network_packages_default_team' from source: role '' defaults 19110 1726882569.50459: variable '__network_team_connections_defined' from source: role '' defaults 19110 1726882569.50649: variable 'network_connections' from source: play vars 19110 1726882569.50652: variable 'profile' from source: play vars 19110 1726882569.50704: variable 'profile' from source: play vars 19110 1726882569.50707: variable 'interface' from source: set_fact 19110 1726882569.50779: variable 'interface' from source: set_fact 19110 1726882569.50815: variable '__network_service_name_default_initscripts' from source: role '' defaults 19110 1726882569.50861: 
variable '__network_service_name_default_initscripts' from source: role '' defaults 19110 1726882569.50864: variable '__network_packages_default_initscripts' from source: role '' defaults 19110 1726882569.50909: variable '__network_packages_default_initscripts' from source: role '' defaults 19110 1726882569.51045: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 19110 1726882569.51482: variable 'network_connections' from source: play vars 19110 1726882569.51485: variable 'profile' from source: play vars 19110 1726882569.51534: variable 'profile' from source: play vars 19110 1726882569.51537: variable 'interface' from source: set_fact 19110 1726882569.51583: variable 'interface' from source: set_fact 19110 1726882569.51590: variable 'ansible_distribution' from source: facts 19110 1726882569.51593: variable '__network_rh_distros' from source: role '' defaults 19110 1726882569.51599: variable 'ansible_distribution_major_version' from source: facts 19110 1726882569.51609: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 19110 1726882569.51720: variable 'ansible_distribution' from source: facts 19110 1726882569.51723: variable '__network_rh_distros' from source: role '' defaults 19110 1726882569.51727: variable 'ansible_distribution_major_version' from source: facts 19110 1726882569.51743: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 19110 1726882569.51851: variable 'ansible_distribution' from source: facts 19110 1726882569.51861: variable '__network_rh_distros' from source: role '' defaults 19110 1726882569.51865: variable 'ansible_distribution_major_version' from source: facts 19110 1726882569.51882: variable 'network_provider' from source: set_fact 19110 1726882569.51893: variable 'ansible_facts' from source: unknown 19110 1726882569.52928: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 19110 
1726882569.52931: when evaluation is False, skipping this task 19110 1726882569.52933: _execute() done 19110 1726882569.52936: dumping result to json 19110 1726882569.52937: done dumping result, returning 19110 1726882569.52940: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages [0e448fcc-3ce9-5372-c19a-000000000044] 19110 1726882569.52942: sending task result for task 0e448fcc-3ce9-5372-c19a-000000000044 19110 1726882569.53017: done sending task result for task 0e448fcc-3ce9-5372-c19a-000000000044 19110 1726882569.53020: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 19110 1726882569.53062: no more pending results, returning what we have 19110 1726882569.53068: results queue empty 19110 1726882569.53069: checking for any_errors_fatal 19110 1726882569.53075: done checking for any_errors_fatal 19110 1726882569.53075: checking for max_fail_percentage 19110 1726882569.53077: done checking for max_fail_percentage 19110 1726882569.53078: checking to see if all hosts have failed and the running result is not ok 19110 1726882569.53079: done checking to see if all hosts have failed 19110 1726882569.53080: getting the remaining hosts for this loop 19110 1726882569.53081: done getting the remaining hosts for this loop 19110 1726882569.53084: getting the next task for host managed_node1 19110 1726882569.53090: done getting next task for host managed_node1 19110 1726882569.53094: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 19110 1726882569.53096: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
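The skip above hinges on Jinja2's `subset` test: the "Install packages" task only runs when `network_packages` is *not* already a subset of the installed package facts. As a rough sketch of that test's semantics (the function name `is_subset` and the sample package lists are illustrative, not taken from the role):

```python
def is_subset(needed, available):
    """Mimic the Jinja2 'subset' test: every needed item already appears in available."""
    return set(needed) <= set(available)

# Illustrative data standing in for network_packages / ansible_facts.packages
network_packages = ["NetworkManager"]
installed = {"NetworkManager": "1.46.0", "openssh": "9.0"}

# The task condition in the log is the negation:
#   not network_packages is subset(ansible_facts.packages.keys())
condition = not is_subset(network_packages, installed.keys())
print(condition)  # False -> conditional is False -> task is skipped
```

Because the needed package is already present, the negated condition evaluates to `False`, which is exactly the `"false_condition"` recorded in the skip result.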
(None), did rescue? False, did start at task? False 19110 1726882569.53108: getting variables 19110 1726882569.53109: in VariableManager get_vars() 19110 1726882569.53143: Calling all_inventory to load vars for managed_node1 19110 1726882569.53145: Calling groups_inventory to load vars for managed_node1 19110 1726882569.53147: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882569.53160: Calling all_plugins_play to load vars for managed_node1 19110 1726882569.53165: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882569.53168: Calling groups_plugins_play to load vars for managed_node1 19110 1726882569.54711: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882569.56430: done with get_vars() 19110 1726882569.56454: done getting variables 19110 1726882569.56518: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 21:36:09 -0400 (0:00:00.134) 0:00:26.422 ****** 19110 1726882569.56550: entering _queue_task() for managed_node1/package 19110 1726882569.56880: worker is 1 (out of 1 available) 19110 1726882569.56892: exiting _queue_task() for managed_node1/package 19110 1726882569.56904: done queuing things up, now waiting for results queue to drain 19110 1726882569.56905: waiting for pending results... 
19110 1726882569.57188: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 19110 1726882569.57303: in run() - task 0e448fcc-3ce9-5372-c19a-000000000045 19110 1726882569.57324: variable 'ansible_search_path' from source: unknown 19110 1726882569.57330: variable 'ansible_search_path' from source: unknown 19110 1726882569.57372: calling self._execute() 19110 1726882569.57468: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882569.57482: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882569.57494: variable 'omit' from source: magic vars 19110 1726882569.57849: variable 'ansible_distribution_major_version' from source: facts 19110 1726882569.57870: Evaluated conditional (ansible_distribution_major_version != '6'): True 19110 1726882569.58005: variable 'network_state' from source: role '' defaults 19110 1726882569.58021: Evaluated conditional (network_state != {}): False 19110 1726882569.58028: when evaluation is False, skipping this task 19110 1726882569.58035: _execute() done 19110 1726882569.58042: dumping result to json 19110 1726882569.58049: done dumping result, returning 19110 1726882569.58061: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0e448fcc-3ce9-5372-c19a-000000000045] 19110 1726882569.58076: sending task result for task 0e448fcc-3ce9-5372-c19a-000000000045 19110 1726882569.58198: done sending task result for task 0e448fcc-3ce9-5372-c19a-000000000045 19110 1726882569.58205: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 19110 1726882569.58258: no more pending results, returning what we have 19110 1726882569.58266: results queue empty 19110 1726882569.58267: checking 
for any_errors_fatal 19110 1726882569.58274: done checking for any_errors_fatal 19110 1726882569.58274: checking for max_fail_percentage 19110 1726882569.58277: done checking for max_fail_percentage 19110 1726882569.58278: checking to see if all hosts have failed and the running result is not ok 19110 1726882569.58279: done checking to see if all hosts have failed 19110 1726882569.58279: getting the remaining hosts for this loop 19110 1726882569.58281: done getting the remaining hosts for this loop 19110 1726882569.58285: getting the next task for host managed_node1 19110 1726882569.58293: done getting next task for host managed_node1 19110 1726882569.58297: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 19110 1726882569.58300: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19110 1726882569.58315: getting variables 19110 1726882569.58317: in VariableManager get_vars() 19110 1726882569.58355: Calling all_inventory to load vars for managed_node1 19110 1726882569.58359: Calling groups_inventory to load vars for managed_node1 19110 1726882569.58361: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882569.58377: Calling all_plugins_play to load vars for managed_node1 19110 1726882569.58381: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882569.58384: Calling groups_plugins_play to load vars for managed_node1 19110 1726882569.60101: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882569.61798: done with get_vars() 19110 1726882569.61844: done getting variables 19110 1726882569.62073: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 21:36:09 -0400 (0:00:00.055) 0:00:26.478 ****** 19110 1726882569.62106: entering _queue_task() for managed_node1/package 19110 1726882569.62402: worker is 1 (out of 1 available) 19110 1726882569.62416: exiting _queue_task() for managed_node1/package 19110 1726882569.62428: done queuing things up, now waiting for results queue to drain 19110 1726882569.62430: waiting for pending results... 
19110 1726882569.62827: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 19110 1726882569.63075: in run() - task 0e448fcc-3ce9-5372-c19a-000000000046 19110 1726882569.63098: variable 'ansible_search_path' from source: unknown 19110 1726882569.63105: variable 'ansible_search_path' from source: unknown 19110 1726882569.63142: calling self._execute() 19110 1726882569.63278: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882569.63377: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882569.63390: variable 'omit' from source: magic vars 19110 1726882569.64235: variable 'ansible_distribution_major_version' from source: facts 19110 1726882569.64266: Evaluated conditional (ansible_distribution_major_version != '6'): True 19110 1726882569.64421: variable 'network_state' from source: role '' defaults 19110 1726882569.64437: Evaluated conditional (network_state != {}): False 19110 1726882569.64445: when evaluation is False, skipping this task 19110 1726882569.64452: _execute() done 19110 1726882569.64460: dumping result to json 19110 1726882569.64471: done dumping result, returning 19110 1726882569.64484: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0e448fcc-3ce9-5372-c19a-000000000046] 19110 1726882569.64499: sending task result for task 0e448fcc-3ce9-5372-c19a-000000000046 skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 19110 1726882569.64653: no more pending results, returning what we have 19110 1726882569.64657: results queue empty 19110 1726882569.64658: checking for any_errors_fatal 19110 1726882569.64669: done checking for any_errors_fatal 19110 1726882569.64670: checking for max_fail_percentage 19110 
1726882569.64671: done checking for max_fail_percentage 19110 1726882569.64673: checking to see if all hosts have failed and the running result is not ok 19110 1726882569.64673: done checking to see if all hosts have failed 19110 1726882569.64674: getting the remaining hosts for this loop 19110 1726882569.64676: done getting the remaining hosts for this loop 19110 1726882569.64680: getting the next task for host managed_node1 19110 1726882569.64687: done getting next task for host managed_node1 19110 1726882569.64691: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 19110 1726882569.64694: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19110 1726882569.64709: getting variables 19110 1726882569.64711: in VariableManager get_vars() 19110 1726882569.64751: Calling all_inventory to load vars for managed_node1 19110 1726882569.64754: Calling groups_inventory to load vars for managed_node1 19110 1726882569.64757: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882569.64772: Calling all_plugins_play to load vars for managed_node1 19110 1726882569.64775: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882569.64779: Calling groups_plugins_play to load vars for managed_node1 19110 1726882569.66184: done sending task result for task 0e448fcc-3ce9-5372-c19a-000000000046 19110 1726882569.66191: WORKER PROCESS EXITING 19110 1726882569.67356: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882569.69075: done with get_vars() 19110 1726882569.69099: done getting variables 19110 1726882569.69155: Loading ActionModule 'service' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 21:36:09 -0400 (0:00:00.070) 0:00:26.549 ****** 19110 1726882569.69190: entering _queue_task() for managed_node1/service 19110 1726882569.69484: worker is 1 (out of 1 available) 19110 1726882569.69497: exiting _queue_task() for managed_node1/service 19110 1726882569.69510: done queuing things up, now waiting for results queue to drain 19110 1726882569.69512: waiting for pending results... 19110 1726882569.69790: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 19110 1726882569.69901: in run() - task 0e448fcc-3ce9-5372-c19a-000000000047 19110 1726882569.69919: variable 'ansible_search_path' from source: unknown 19110 1726882569.69927: variable 'ansible_search_path' from source: unknown 19110 1726882569.69973: calling self._execute() 19110 1726882569.70070: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882569.70081: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882569.70094: variable 'omit' from source: magic vars 19110 1726882569.70457: variable 'ansible_distribution_major_version' from source: facts 19110 1726882569.70476: Evaluated conditional (ansible_distribution_major_version != '6'): True 19110 1726882569.70600: variable '__network_wireless_connections_defined' from source: role '' defaults 19110 1726882569.70802: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due 
to reserved name 19110 1726882569.74549: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 19110 1726882569.74644: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 19110 1726882569.74696: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 19110 1726882569.74741: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 19110 1726882569.74779: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 19110 1726882569.74864: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19110 1726882569.74903: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19110 1726882569.74936: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19110 1726882569.74990: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19110 1726882569.75010: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19110 1726882569.75102: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 19110 1726882569.75131: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19110 1726882569.75219: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19110 1726882569.75315: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19110 1726882569.75391: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19110 1726882569.75441: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19110 1726882569.75513: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19110 1726882569.75600: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19110 1726882569.75725: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19110 1726882569.75751: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19110 1726882569.76100: variable 'network_connections' from source: play vars 19110 1726882569.76155: variable 'profile' from source: play vars 19110 1726882569.76329: variable 'profile' from source: play vars 19110 1726882569.76366: variable 'interface' from source: set_fact 19110 1726882569.76518: variable 'interface' from source: set_fact 19110 1726882569.76636: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 19110 1726882569.77062: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 19110 1726882569.77181: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 19110 1726882569.77216: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 19110 1726882569.77287: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 19110 1726882569.77420: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 19110 1726882569.77494: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 19110 1726882569.77714: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 19110 1726882569.77750: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 19110 1726882569.77923: variable 
'__network_team_connections_defined' from source: role '' defaults 19110 1726882569.78253: variable 'network_connections' from source: play vars 19110 1726882569.78268: variable 'profile' from source: play vars 19110 1726882569.78334: variable 'profile' from source: play vars 19110 1726882569.78344: variable 'interface' from source: set_fact 19110 1726882569.78408: variable 'interface' from source: set_fact 19110 1726882569.78442: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 19110 1726882569.78450: when evaluation is False, skipping this task 19110 1726882569.78461: _execute() done 19110 1726882569.78470: dumping result to json 19110 1726882569.78477: done dumping result, returning 19110 1726882569.78487: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0e448fcc-3ce9-5372-c19a-000000000047] 19110 1726882569.78502: sending task result for task 0e448fcc-3ce9-5372-c19a-000000000047 skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 19110 1726882569.78650: no more pending results, returning what we have 19110 1726882569.78653: results queue empty 19110 1726882569.78654: checking for any_errors_fatal 19110 1726882569.78659: done checking for any_errors_fatal 19110 1726882569.78660: checking for max_fail_percentage 19110 1726882569.78662: done checking for max_fail_percentage 19110 1726882569.78667: checking to see if all hosts have failed and the running result is not ok 19110 1726882569.78667: done checking to see if all hosts have failed 19110 1726882569.78668: getting the remaining hosts for this loop 19110 1726882569.78670: done getting the remaining hosts for this loop 19110 1726882569.78674: getting the next task for host managed_node1 19110 1726882569.78683: done 
getting next task for host managed_node1 19110 1726882569.78687: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 19110 1726882569.78690: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19110 1726882569.78705: getting variables 19110 1726882569.78708: in VariableManager get_vars() 19110 1726882569.78749: Calling all_inventory to load vars for managed_node1 19110 1726882569.78752: Calling groups_inventory to load vars for managed_node1 19110 1726882569.78755: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882569.78767: Calling all_plugins_play to load vars for managed_node1 19110 1726882569.78770: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882569.78774: Calling groups_plugins_play to load vars for managed_node1 19110 1726882569.79787: done sending task result for task 0e448fcc-3ce9-5372-c19a-000000000047 19110 1726882569.79790: WORKER PROCESS EXITING 19110 1726882569.80841: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882569.82983: done with get_vars() 19110 1726882569.83038: done getting variables 19110 1726882569.83150: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 21:36:09 -0400 (0:00:00.139) 
0:00:26.689 ****** 19110 1726882569.83187: entering _queue_task() for managed_node1/service 19110 1726882569.83552: worker is 1 (out of 1 available) 19110 1726882569.83571: exiting _queue_task() for managed_node1/service 19110 1726882569.83584: done queuing things up, now waiting for results queue to drain 19110 1726882569.83586: waiting for pending results... 19110 1726882569.83893: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 19110 1726882569.84014: in run() - task 0e448fcc-3ce9-5372-c19a-000000000048 19110 1726882569.84273: variable 'ansible_search_path' from source: unknown 19110 1726882569.84277: variable 'ansible_search_path' from source: unknown 19110 1726882569.84279: calling self._execute() 19110 1726882569.84282: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882569.84285: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882569.84288: variable 'omit' from source: magic vars 19110 1726882569.84589: variable 'ansible_distribution_major_version' from source: facts 19110 1726882569.84601: Evaluated conditional (ansible_distribution_major_version != '6'): True 19110 1726882569.84781: variable 'network_provider' from source: set_fact 19110 1726882569.84784: variable 'network_state' from source: role '' defaults 19110 1726882569.84801: Evaluated conditional (network_provider == "nm" or network_state != {}): True 19110 1726882569.84816: variable 'omit' from source: magic vars 19110 1726882569.84845: variable 'omit' from source: magic vars 19110 1726882569.84874: variable 'network_service_name' from source: role '' defaults 19110 1726882569.85001: variable 'network_service_name' from source: role '' defaults 19110 1726882569.85138: variable '__network_provider_setup' from source: role '' defaults 19110 1726882569.85375: variable '__network_service_name_default_nm' from source: role '' defaults 19110 1726882569.85439: variable 
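Unlike the earlier package tasks, the "Enable and start NetworkManager" task passes its gate: the log shows `network_provider == "nm" or network_state != {}` evaluating to `True`. A minimal sketch of that short-circuit gating logic (an assumed simplification for illustration, not the role's actual implementation; the function name is hypothetical):

```python
def should_manage_networkmanager(network_provider, network_state):
    """Run the service task when the provider is NetworkManager ("nm")
    or an explicit network_state dict was supplied."""
    return network_provider == "nm" or network_state != {}

print(should_manage_networkmanager("nm", {}))           # True, as in the log
print(should_manage_networkmanager("initscripts", {}))  # False -> task would skip
```

With `network_provider` set to `"nm"` by the earlier `set_fact`, the first disjunct is already true, so the service task proceeds even though `network_state` is still the empty default.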
'__network_service_name_default_nm' from source: role '' defaults 19110 1726882569.85489: variable '__network_packages_default_nm' from source: role '' defaults 19110 1726882569.85555: variable '__network_packages_default_nm' from source: role '' defaults 19110 1726882569.85812: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 19110 1726882569.89158: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 19110 1726882569.89237: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 19110 1726882569.89279: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 19110 1726882569.89321: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 19110 1726882569.89349: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 19110 1726882569.89433: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19110 1726882569.89466: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19110 1726882569.89491: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19110 1726882569.89538: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19110 1726882569.89554: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19110 1726882569.89605: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19110 1726882569.89627: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19110 1726882569.89656: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19110 1726882569.89702: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19110 1726882569.89713: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19110 1726882569.90017: variable '__network_packages_default_gobject_packages' from source: role '' defaults 19110 1726882569.90169: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19110 1726882569.90172: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19110 1726882569.90175: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19110 1726882569.90290: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19110 1726882569.90293: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19110 1726882569.90314: variable 'ansible_python' from source: facts 19110 1726882569.90335: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 19110 1726882569.90506: variable '__network_wpa_supplicant_required' from source: role '' defaults 19110 1726882569.90629: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 19110 1726882569.91486: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19110 1726882569.91510: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19110 1726882569.91540: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19110 1726882569.91712: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19110 1726882569.91728: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19110 1726882569.91781: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19110 1726882569.91928: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19110 1726882569.91953: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19110 1726882569.91995: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19110 1726882569.92170: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19110 1726882569.92305: variable 'network_connections' from source: play vars 19110 1726882569.92308: variable 'profile' from source: play vars 19110 1726882569.92369: variable 'profile' from source: play vars 19110 1726882569.92373: variable 'interface' from source: set_fact 19110 1726882569.92433: variable 'interface' from source: set_fact 19110 1726882569.92547: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 19110 1726882569.93028: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 19110 1726882569.93085: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 19110 1726882569.93246: Loading 
TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 19110 1726882569.93291: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 19110 1726882569.93483: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 19110 1726882569.93506: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 19110 1726882569.93539: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 19110 1726882569.93693: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 19110 1726882569.93739: variable '__network_wireless_connections_defined' from source: role '' defaults 19110 1726882569.94299: variable 'network_connections' from source: play vars 19110 1726882569.94305: variable 'profile' from source: play vars 19110 1726882569.94397: variable 'profile' from source: play vars 19110 1726882569.94402: variable 'interface' from source: set_fact 19110 1726882569.94471: variable 'interface' from source: set_fact 19110 1726882569.94502: variable '__network_packages_default_wireless' from source: role '' defaults 19110 1726882569.94705: variable '__network_wireless_connections_defined' from source: role '' defaults 19110 1726882569.95128: variable 'network_connections' from source: play vars 19110 1726882569.95132: variable 'profile' from source: play vars 19110 1726882569.95215: variable 'profile' from source: play vars 19110 1726882569.95219: variable 'interface' from 
source: set_fact 19110 1726882569.95299: variable 'interface' from source: set_fact 19110 1726882569.95331: variable '__network_packages_default_team' from source: role '' defaults 19110 1726882569.95419: variable '__network_team_connections_defined' from source: role '' defaults 19110 1726882569.95740: variable 'network_connections' from source: play vars 19110 1726882569.95752: variable 'profile' from source: play vars 19110 1726882569.95838: variable 'profile' from source: play vars 19110 1726882569.95841: variable 'interface' from source: set_fact 19110 1726882569.96147: variable 'interface' from source: set_fact 19110 1726882569.96207: variable '__network_service_name_default_initscripts' from source: role '' defaults 19110 1726882569.96277: variable '__network_service_name_default_initscripts' from source: role '' defaults 19110 1726882569.96284: variable '__network_packages_default_initscripts' from source: role '' defaults 19110 1726882569.97151: variable '__network_packages_default_initscripts' from source: role '' defaults 19110 1726882569.97395: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 19110 1726882569.98175: variable 'network_connections' from source: play vars 19110 1726882569.98185: variable 'profile' from source: play vars 19110 1726882569.98265: variable 'profile' from source: play vars 19110 1726882569.98269: variable 'interface' from source: set_fact 19110 1726882569.98347: variable 'interface' from source: set_fact 19110 1726882569.98356: variable 'ansible_distribution' from source: facts 19110 1726882569.98363: variable '__network_rh_distros' from source: role '' defaults 19110 1726882569.98370: variable 'ansible_distribution_major_version' from source: facts 19110 1726882569.98384: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 19110 1726882569.98586: variable 'ansible_distribution' from source: facts 19110 1726882569.98589: variable 
'__network_rh_distros' from source: role '' defaults 19110 1726882569.98595: variable 'ansible_distribution_major_version' from source: facts 19110 1726882569.98608: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 19110 1726882569.98806: variable 'ansible_distribution' from source: facts 19110 1726882569.98809: variable '__network_rh_distros' from source: role '' defaults 19110 1726882569.98814: variable 'ansible_distribution_major_version' from source: facts 19110 1726882569.98856: variable 'network_provider' from source: set_fact 19110 1726882569.98888: variable 'omit' from source: magic vars 19110 1726882569.98917: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19110 1726882569.98943: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19110 1726882569.98970: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19110 1726882569.98993: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19110 1726882569.99003: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19110 1726882569.99036: variable 'inventory_hostname' from source: host vars for 'managed_node1' 19110 1726882569.99039: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882569.99043: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882569.99158: Set connection var ansible_timeout to 10 19110 1726882569.99180: Set connection var ansible_module_compression to ZIP_DEFLATED 19110 1726882569.99185: Set connection var ansible_shell_executable to /bin/sh 19110 1726882569.99188: Set connection var ansible_shell_type to sh 19110 1726882569.99190: Set connection var ansible_connection to ssh 19110 
1726882569.99195: Set connection var ansible_pipelining to False 19110 1726882569.99228: variable 'ansible_shell_executable' from source: unknown 19110 1726882569.99232: variable 'ansible_connection' from source: unknown 19110 1726882569.99235: variable 'ansible_module_compression' from source: unknown 19110 1726882569.99237: variable 'ansible_shell_type' from source: unknown 19110 1726882569.99239: variable 'ansible_shell_executable' from source: unknown 19110 1726882569.99241: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882569.99250: variable 'ansible_pipelining' from source: unknown 19110 1726882569.99252: variable 'ansible_timeout' from source: unknown 19110 1726882569.99255: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882569.99368: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 19110 1726882569.99383: variable 'omit' from source: magic vars 19110 1726882569.99389: starting attempt loop 19110 1726882569.99391: running the handler 19110 1726882569.99478: variable 'ansible_facts' from source: unknown 19110 1726882570.00389: _low_level_execute_command(): starting 19110 1726882570.00401: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 19110 1726882570.01131: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882570.01146: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882570.01166: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882570.01182: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882570.01220: stderr chunk (state=3): 
>>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882570.01227: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882570.01237: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882570.01255: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882570.01274: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882570.01281: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882570.01289: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882570.01298: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882570.01309: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882570.01316: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882570.01322: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882570.01331: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882570.01412: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882570.01427: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882570.01433: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882570.01562: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882570.03234: stdout chunk (state=3): >>>/root <<< 19110 1726882570.03402: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882570.03405: stdout chunk (state=3): >>><<< 19110 1726882570.03415: stderr chunk (state=3): >>><<< 19110 1726882570.03433: 
_low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 1726882570.03444: _low_level_execute_command(): starting 19110 1726882570.03449: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882570.0343208-20299-42351370066738 `" && echo ansible-tmp-1726882570.0343208-20299-42351370066738="` echo /root/.ansible/tmp/ansible-tmp-1726882570.0343208-20299-42351370066738 `" ) && sleep 0' 19110 1726882570.04153: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882570.04162: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882570.04176: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882570.04190: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882570.04228: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882570.04240: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882570.04250: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882570.04265: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882570.04273: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882570.04280: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882570.04288: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882570.04297: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882570.04307: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882570.04315: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882570.04321: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882570.04331: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882570.04407: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882570.04423: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882570.04435: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882570.04560: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882570.06428: stdout chunk (state=3): 
>>>ansible-tmp-1726882570.0343208-20299-42351370066738=/root/.ansible/tmp/ansible-tmp-1726882570.0343208-20299-42351370066738 <<< 19110 1726882570.06582: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882570.06634: stderr chunk (state=3): >>><<< 19110 1726882570.06637: stdout chunk (state=3): >>><<< 19110 1726882570.06659: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882570.0343208-20299-42351370066738=/root/.ansible/tmp/ansible-tmp-1726882570.0343208-20299-42351370066738 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 1726882570.06693: variable 'ansible_module_compression' from source: unknown 19110 1726882570.06748: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-191108pnkimox/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 19110 1726882570.06811: variable 'ansible_facts' from source: unknown 19110 
1726882570.07235: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882570.0343208-20299-42351370066738/AnsiballZ_systemd.py 19110 1726882570.07410: Sending initial data 19110 1726882570.07413: Sent initial data (155 bytes) 19110 1726882570.08469: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882570.08488: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882570.08503: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882570.08522: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882570.08580: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882570.08594: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882570.08609: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882570.08627: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882570.08640: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882570.08661: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882570.08679: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882570.08694: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882570.08710: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882570.08722: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882570.08733: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882570.08747: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882570.08832: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882570.08860: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882570.08885: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882570.09014: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882570.10744: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 19110 1726882570.10838: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 19110 1726882570.10928: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-191108pnkimox/tmptb_qtql7 /root/.ansible/tmp/ansible-tmp-1726882570.0343208-20299-42351370066738/AnsiballZ_systemd.py <<< 19110 1726882570.11023: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 19110 1726882570.13883: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882570.14086: stderr chunk (state=3): >>><<< 19110 1726882570.14089: stdout chunk (state=3): >>><<< 19110 1726882570.14091: done transferring module to remote 19110 1726882570.14094: _low_level_execute_command(): starting 19110 1726882570.14096: _low_level_execute_command(): executing: /bin/sh -c 
'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882570.0343208-20299-42351370066738/ /root/.ansible/tmp/ansible-tmp-1726882570.0343208-20299-42351370066738/AnsiballZ_systemd.py && sleep 0' 19110 1726882570.14952: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882570.14971: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882570.14984: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882570.14999: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882570.15038: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882570.15048: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882570.15065: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882570.15082: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882570.15126: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882570.15138: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882570.15149: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882570.15163: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882570.15179: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882570.15188: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882570.15199: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882570.15210: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882570.15289: 
stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882570.15795: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882570.15810: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882570.15942: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882570.17762: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882570.17767: stdout chunk (state=3): >>><<< 19110 1726882570.17770: stderr chunk (state=3): >>><<< 19110 1726882570.17858: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 1726882570.17868: _low_level_execute_command(): starting 19110 1726882570.17871: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 
/root/.ansible/tmp/ansible-tmp-1726882570.0343208-20299-42351370066738/AnsiballZ_systemd.py && sleep 0' 19110 1726882570.18417: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882570.18432: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882570.18446: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882570.18470: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882570.18513: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882570.18527: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882570.18542: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882570.18563: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882570.18580: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882570.18592: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882570.18604: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882570.18619: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882570.18635: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882570.18648: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882570.18667: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882570.18682: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882570.18754: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 
1726882570.18778: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882570.18793: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882570.18977: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882570.44008: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "618", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:24:29 EDT", "ExecMainStartTimestampMonotonic": "27221076", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "618", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; 
start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.<<< 19110 1726882570.44014: stdout chunk (state=3): >>>service", "ControlGroupId": "2455", "MemoryCurrent": "16142336", "MemoryAvailable": "infinity", "CPUUsageNSec": "939507000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": 
"infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", 
"RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target NetworkManager-wait-online.service network.target multi-user.target network.service cloud-init.service", "After": "network-pre.target system.slice cloud-init-local.service sysinit.target systemd-journald.socket basic.target dbus-broker.service dbus.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:32:48 EDT", "StateChangeTimestampMonotonic": "526071006", "InactiveExitTimestamp": "Fri 2024-09-20 21:24:29 EDT", "InactiveExitTimestampMonotonic": "27221264", "ActiveEnterTimestamp": "Fri 2024-09-20 21:24:30 EDT", "ActiveEnterTimestampMonotonic": "28518220", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", 
"CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:24:29 EDT", "ConditionTimestampMonotonic": "27216465", "AssertTimestamp": "Fri 2024-09-20 21:24:29 EDT", "AssertTimestampMonotonic": "27216468", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "ac59cab3098f415297681de935e089f5", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 19110 1726882570.45553: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
<<< 19110 1726882570.45559: stdout chunk (state=3): >>><<< 19110 1726882570.45562: stderr chunk (state=3): >>><<< 19110 1726882570.45583: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "618", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:24:29 EDT", "ExecMainStartTimestampMonotonic": "27221076", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "618", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call 
org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2455", "MemoryCurrent": "16142336", "MemoryAvailable": "infinity", "CPUUsageNSec": "939507000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", 
"LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", 
"MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target NetworkManager-wait-online.service network.target multi-user.target network.service cloud-init.service", "After": "network-pre.target system.slice cloud-init-local.service sysinit.target systemd-journald.socket basic.target dbus-broker.service dbus.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:32:48 EDT", "StateChangeTimestampMonotonic": "526071006", "InactiveExitTimestamp": "Fri 2024-09-20 21:24:29 EDT", "InactiveExitTimestampMonotonic": "27221264", "ActiveEnterTimestamp": "Fri 2024-09-20 21:24:30 EDT", "ActiveEnterTimestampMonotonic": "28518220", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", 
"OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:24:29 EDT", "ConditionTimestampMonotonic": "27216465", "AssertTimestamp": "Fri 2024-09-20 21:24:29 EDT", "AssertTimestampMonotonic": "27216468", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "ac59cab3098f415297681de935e089f5", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master 
session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 19110 1726882570.45749: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882570.0343208-20299-42351370066738/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 19110 1726882570.45769: _low_level_execute_command(): starting 19110 1726882570.45774: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882570.0343208-20299-42351370066738/ > /dev/null 2>&1 && sleep 0' 19110 1726882570.46672: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882570.46675: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882570.46677: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882570.46679: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882570.46681: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882570.46683: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882570.46685: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882570.46687: stderr chunk 
(state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882570.46689: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882570.46691: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882570.46693: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882570.46695: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882570.46696: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882570.46700: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882570.46702: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882570.46706: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882570.46709: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882570.46710: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882570.46712: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882570.46731: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882570.48582: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882570.48585: stdout chunk (state=3): >>><<< 19110 1726882570.48593: stderr chunk (state=3): >>><<< 19110 1726882570.48607: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 1726882570.48613: handler run complete 19110 1726882570.48678: attempt loop complete, returning result 19110 1726882570.48681: _execute() done 19110 1726882570.48683: dumping result to json 19110 1726882570.48700: done dumping result, returning 19110 1726882570.48712: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0e448fcc-3ce9-5372-c19a-000000000048] 19110 1726882570.48717: sending task result for task 0e448fcc-3ce9-5372-c19a-000000000048 19110 1726882570.48987: done sending task result for task 0e448fcc-3ce9-5372-c19a-000000000048 19110 1726882570.48990: WORKER PROCESS EXITING ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 19110 1726882570.49039: no more pending results, returning what we have 19110 1726882570.49042: results queue empty 19110 1726882570.49043: checking for any_errors_fatal 19110 1726882570.49050: done checking for any_errors_fatal 19110 1726882570.49051: checking for max_fail_percentage 19110 1726882570.49052: done checking for max_fail_percentage 19110 1726882570.49053: checking to see 
if all hosts have failed and the running result is not ok 19110 1726882570.49054: done checking to see if all hosts have failed 19110 1726882570.49057: getting the remaining hosts for this loop 19110 1726882570.49058: done getting the remaining hosts for this loop 19110 1726882570.49062: getting the next task for host managed_node1 19110 1726882570.49069: done getting next task for host managed_node1 19110 1726882570.49072: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 19110 1726882570.49074: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19110 1726882570.49084: getting variables 19110 1726882570.49085: in VariableManager get_vars() 19110 1726882570.49116: Calling all_inventory to load vars for managed_node1 19110 1726882570.49119: Calling groups_inventory to load vars for managed_node1 19110 1726882570.49121: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882570.49129: Calling all_plugins_play to load vars for managed_node1 19110 1726882570.49132: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882570.49134: Calling groups_plugins_play to load vars for managed_node1 19110 1726882570.50646: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882570.52745: done with get_vars() 19110 1726882570.52774: done getting variables 19110 1726882570.52845: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK 
[fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 21:36:10 -0400 (0:00:00.696) 0:00:27.386 ****** 19110 1726882570.52885: entering _queue_task() for managed_node1/service 19110 1726882570.53209: worker is 1 (out of 1 available) 19110 1726882570.53224: exiting _queue_task() for managed_node1/service 19110 1726882570.53241: done queuing things up, now waiting for results queue to drain 19110 1726882570.53243: waiting for pending results... 19110 1726882570.53579: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 19110 1726882570.53740: in run() - task 0e448fcc-3ce9-5372-c19a-000000000049 19110 1726882570.53771: variable 'ansible_search_path' from source: unknown 19110 1726882570.53780: variable 'ansible_search_path' from source: unknown 19110 1726882570.53829: calling self._execute() 19110 1726882570.53940: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882570.53952: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882570.53971: variable 'omit' from source: magic vars 19110 1726882570.54412: variable 'ansible_distribution_major_version' from source: facts 19110 1726882570.54434: Evaluated conditional (ansible_distribution_major_version != '6'): True 19110 1726882570.54581: variable 'network_provider' from source: set_fact 19110 1726882570.54592: Evaluated conditional (network_provider == "nm"): True 19110 1726882570.54703: variable '__network_wpa_supplicant_required' from source: role '' defaults 19110 1726882570.54808: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 19110 1726882570.55015: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 19110 1726882570.58148: Loading FilterModule 'core' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 19110 1726882570.58225: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 19110 1726882570.58269: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 19110 1726882570.58315: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 19110 1726882570.58346: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 19110 1726882570.58435: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19110 1726882570.58474: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19110 1726882570.58514: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19110 1726882570.58565: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19110 1726882570.58585: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19110 1726882570.58642: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19110 1726882570.58743: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19110 1726882570.58842: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19110 1726882570.58890: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19110 1726882570.58958: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19110 1726882570.59005: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19110 1726882570.59082: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19110 1726882570.59190: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19110 1726882570.59234: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19110 1726882570.59394: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 
19110 1726882570.59645: variable 'network_connections' from source: play vars 19110 1726882570.59718: variable 'profile' from source: play vars 19110 1726882570.59794: variable 'profile' from source: play vars 19110 1726882570.59876: variable 'interface' from source: set_fact 19110 1726882570.60013: variable 'interface' from source: set_fact 19110 1726882570.60692: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 19110 1726882570.60969: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 19110 1726882570.61042: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 19110 1726882570.61085: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 19110 1726882570.61131: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 19110 1726882570.61186: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 19110 1726882570.61215: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 19110 1726882570.61254: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 19110 1726882570.61290: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 19110 1726882570.61354: variable '__network_wireless_connections_defined' from source: role '' defaults 19110 1726882570.61620: variable 'network_connections' 
from source: play vars 19110 1726882570.61630: variable 'profile' from source: play vars 19110 1726882570.61710: variable 'profile' from source: play vars 19110 1726882570.61719: variable 'interface' from source: set_fact 19110 1726882570.61795: variable 'interface' from source: set_fact 19110 1726882570.61830: Evaluated conditional (__network_wpa_supplicant_required): False 19110 1726882570.61838: when evaluation is False, skipping this task 19110 1726882570.61845: _execute() done 19110 1726882570.61869: dumping result to json 19110 1726882570.61885: done dumping result, returning 19110 1726882570.61897: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0e448fcc-3ce9-5372-c19a-000000000049] 19110 1726882570.61908: sending task result for task 0e448fcc-3ce9-5372-c19a-000000000049 skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 19110 1726882570.62074: no more pending results, returning what we have 19110 1726882570.62079: results queue empty 19110 1726882570.62080: checking for any_errors_fatal 19110 1726882570.62102: done checking for any_errors_fatal 19110 1726882570.62103: checking for max_fail_percentage 19110 1726882570.62105: done checking for max_fail_percentage 19110 1726882570.62106: checking to see if all hosts have failed and the running result is not ok 19110 1726882570.62107: done checking to see if all hosts have failed 19110 1726882570.62108: getting the remaining hosts for this loop 19110 1726882570.62110: done getting the remaining hosts for this loop 19110 1726882570.62114: getting the next task for host managed_node1 19110 1726882570.62121: done getting next task for host managed_node1 19110 1726882570.62127: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 19110 1726882570.62130: ^ state is: HOST STATE: block=2, task=18, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19110 1726882570.62143: getting variables 19110 1726882570.62145: in VariableManager get_vars() 19110 1726882570.62197: Calling all_inventory to load vars for managed_node1 19110 1726882570.62201: Calling groups_inventory to load vars for managed_node1 19110 1726882570.62204: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882570.62215: Calling all_plugins_play to load vars for managed_node1 19110 1726882570.62218: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882570.62221: Calling groups_plugins_play to load vars for managed_node1 19110 1726882570.63619: done sending task result for task 0e448fcc-3ce9-5372-c19a-000000000049 19110 1726882570.63623: WORKER PROCESS EXITING 19110 1726882570.65541: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882570.67808: done with get_vars() 19110 1726882570.67833: done getting variables 19110 1726882570.67906: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 21:36:10 -0400 (0:00:00.150) 0:00:27.536 ****** 19110 1726882570.67936: entering _queue_task() for managed_node1/service 19110 1726882570.68485: worker is 1 (out of 1 available) 19110 1726882570.68499: exiting _queue_task() for managed_node1/service 19110 
1726882570.68512: done queuing things up, now waiting for results queue to drain 19110 1726882570.68514: waiting for pending results... 19110 1726882570.69099: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service 19110 1726882570.69231: in run() - task 0e448fcc-3ce9-5372-c19a-00000000004a 19110 1726882570.69253: variable 'ansible_search_path' from source: unknown 19110 1726882570.69267: variable 'ansible_search_path' from source: unknown 19110 1726882570.69321: calling self._execute() 19110 1726882570.69430: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882570.69443: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882570.69460: variable 'omit' from source: magic vars 19110 1726882570.69896: variable 'ansible_distribution_major_version' from source: facts 19110 1726882570.69914: Evaluated conditional (ansible_distribution_major_version != '6'): True 19110 1726882570.70044: variable 'network_provider' from source: set_fact 19110 1726882570.70070: Evaluated conditional (network_provider == "initscripts"): False 19110 1726882570.70078: when evaluation is False, skipping this task 19110 1726882570.70085: _execute() done 19110 1726882570.70092: dumping result to json 19110 1726882570.70099: done dumping result, returning 19110 1726882570.70109: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service [0e448fcc-3ce9-5372-c19a-00000000004a] 19110 1726882570.70118: sending task result for task 0e448fcc-3ce9-5372-c19a-00000000004a skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 19110 1726882570.70273: no more pending results, returning what we have 19110 1726882570.70278: results queue empty 19110 1726882570.70279: checking for any_errors_fatal 19110 1726882570.70289: done checking for 
any_errors_fatal 19110 1726882570.70290: checking for max_fail_percentage 19110 1726882570.70291: done checking for max_fail_percentage 19110 1726882570.70292: checking to see if all hosts have failed and the running result is not ok 19110 1726882570.70293: done checking to see if all hosts have failed 19110 1726882570.70294: getting the remaining hosts for this loop 19110 1726882570.70296: done getting the remaining hosts for this loop 19110 1726882570.70300: getting the next task for host managed_node1 19110 1726882570.70306: done getting next task for host managed_node1 19110 1726882570.70311: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 19110 1726882570.70314: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19110 1726882570.70332: getting variables 19110 1726882570.70334: in VariableManager get_vars() 19110 1726882570.70381: Calling all_inventory to load vars for managed_node1 19110 1726882570.70385: Calling groups_inventory to load vars for managed_node1 19110 1726882570.70388: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882570.70401: Calling all_plugins_play to load vars for managed_node1 19110 1726882570.70405: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882570.70408: Calling groups_plugins_play to load vars for managed_node1 19110 1726882570.71550: done sending task result for task 0e448fcc-3ce9-5372-c19a-00000000004a 19110 1726882570.71553: WORKER PROCESS EXITING 19110 1726882570.72611: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882570.76139: done with get_vars() 19110 1726882570.76169: done getting variables 19110 1726882570.76519: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 21:36:10 -0400 (0:00:00.086) 0:00:27.622 ****** 19110 1726882570.76553: entering _queue_task() for managed_node1/copy 19110 1726882570.76849: worker is 1 (out of 1 available) 19110 1726882570.76861: exiting _queue_task() for managed_node1/copy 19110 1726882570.76874: done queuing things up, now waiting for results queue to drain 19110 1726882570.76875: waiting for pending results... 
19110 1726882570.77144: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 19110 1726882570.77245: in run() - task 0e448fcc-3ce9-5372-c19a-00000000004b 19110 1726882570.77259: variable 'ansible_search_path' from source: unknown 19110 1726882570.77263: variable 'ansible_search_path' from source: unknown 19110 1726882570.77297: calling self._execute() 19110 1726882570.77383: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882570.77387: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882570.77397: variable 'omit' from source: magic vars 19110 1726882570.77860: variable 'ansible_distribution_major_version' from source: facts 19110 1726882570.77871: Evaluated conditional (ansible_distribution_major_version != '6'): True 19110 1726882570.77992: variable 'network_provider' from source: set_fact 19110 1726882570.77996: Evaluated conditional (network_provider == "initscripts"): False 19110 1726882570.77998: when evaluation is False, skipping this task 19110 1726882570.78003: _execute() done 19110 1726882570.78007: dumping result to json 19110 1726882570.78009: done dumping result, returning 19110 1726882570.78024: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0e448fcc-3ce9-5372-c19a-00000000004b] 19110 1726882570.78027: sending task result for task 0e448fcc-3ce9-5372-c19a-00000000004b 19110 1726882570.78127: done sending task result for task 0e448fcc-3ce9-5372-c19a-00000000004b 19110 1726882570.78132: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 19110 1726882570.78185: no more pending results, returning what we have 19110 1726882570.78189: results queue empty 19110 1726882570.78191: checking for 
any_errors_fatal 19110 1726882570.78196: done checking for any_errors_fatal 19110 1726882570.78197: checking for max_fail_percentage 19110 1726882570.78198: done checking for max_fail_percentage 19110 1726882570.78199: checking to see if all hosts have failed and the running result is not ok 19110 1726882570.78200: done checking to see if all hosts have failed 19110 1726882570.78201: getting the remaining hosts for this loop 19110 1726882570.78202: done getting the remaining hosts for this loop 19110 1726882570.78206: getting the next task for host managed_node1 19110 1726882570.78213: done getting next task for host managed_node1 19110 1726882570.78217: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 19110 1726882570.78219: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19110 1726882570.78234: getting variables 19110 1726882570.78236: in VariableManager get_vars() 19110 1726882570.78279: Calling all_inventory to load vars for managed_node1 19110 1726882570.78283: Calling groups_inventory to load vars for managed_node1 19110 1726882570.78285: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882570.78298: Calling all_plugins_play to load vars for managed_node1 19110 1726882570.78301: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882570.78305: Calling groups_plugins_play to load vars for managed_node1 19110 1726882570.80904: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882570.83969: done with get_vars() 19110 1726882570.83997: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 21:36:10 -0400 (0:00:00.084) 0:00:27.706 ****** 19110 1726882570.84958: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 19110 1726882570.85274: worker is 1 (out of 1 available) 19110 1726882570.85287: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 19110 1726882570.85300: done queuing things up, now waiting for results queue to drain 19110 1726882570.85301: waiting for pending results... 
19110 1726882570.85819: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 19110 1726882570.85931: in run() - task 0e448fcc-3ce9-5372-c19a-00000000004c 19110 1726882570.85951: variable 'ansible_search_path' from source: unknown 19110 1726882570.85966: variable 'ansible_search_path' from source: unknown 19110 1726882570.86010: calling self._execute() 19110 1726882570.86395: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882570.86407: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882570.86419: variable 'omit' from source: magic vars 19110 1726882570.86880: variable 'ansible_distribution_major_version' from source: facts 19110 1726882570.86896: Evaluated conditional (ansible_distribution_major_version != '6'): True 19110 1726882570.86905: variable 'omit' from source: magic vars 19110 1726882570.86944: variable 'omit' from source: magic vars 19110 1726882570.87111: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 19110 1726882570.91068: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 19110 1726882570.91136: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 19110 1726882570.91182: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 19110 1726882570.91224: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 19110 1726882570.91253: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 19110 1726882570.91343: variable 'network_provider' from source: set_fact 19110 1726882570.91484: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19110 1726882570.91517: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19110 1726882570.92214: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19110 1726882570.92268: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19110 1726882570.92331: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19110 1726882570.92493: variable 'omit' from source: magic vars 19110 1726882570.92734: variable 'omit' from source: magic vars 19110 1726882570.92853: variable 'network_connections' from source: play vars 19110 1726882570.92877: variable 'profile' from source: play vars 19110 1726882570.92946: variable 'profile' from source: play vars 19110 1726882570.92959: variable 'interface' from source: set_fact 19110 1726882570.93028: variable 'interface' from source: set_fact 19110 1726882570.93184: variable 'omit' from source: magic vars 19110 1726882570.93201: variable '__lsr_ansible_managed' from source: task vars 19110 1726882570.93269: variable '__lsr_ansible_managed' from source: task vars 19110 1726882570.93573: Loaded config def from plugin (lookup/template) 19110 1726882570.93584: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 19110 1726882570.93619: File lookup term: get_ansible_managed.j2 19110 
1726882570.93632: variable 'ansible_search_path' from source: unknown 19110 1726882570.93642: evaluation_path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 19110 1726882570.93664: search_path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 19110 1726882570.93689: variable 'ansible_search_path' from source: unknown 19110 1726882571.06917: variable 'ansible_managed' from source: unknown 19110 1726882571.07065: variable 'omit' from source: magic vars 19110 1726882571.07097: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19110 1726882571.07124: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19110 1726882571.07142: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19110 1726882571.07161: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19110 1726882571.07175: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19110 1726882571.07207: variable 'inventory_hostname' from source: host vars for 'managed_node1' 19110 1726882571.07210: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882571.07213: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882571.07309: Set connection var ansible_timeout to 10 19110 1726882571.07323: Set connection var ansible_module_compression to ZIP_DEFLATED 19110 1726882571.07328: Set connection var ansible_shell_executable to /bin/sh 19110 1726882571.07331: Set connection var ansible_shell_type to sh 19110 1726882571.07333: Set connection var ansible_connection to ssh 19110 1726882571.07338: Set connection var ansible_pipelining to False 19110 1726882571.07361: variable 'ansible_shell_executable' from source: unknown 19110 1726882571.07366: variable 'ansible_connection' from source: unknown 19110 1726882571.07369: variable 'ansible_module_compression' from source: unknown 19110 1726882571.07371: variable 'ansible_shell_type' from source: unknown 19110 1726882571.07373: variable 'ansible_shell_executable' from source: unknown 19110 1726882571.07375: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882571.07380: variable 'ansible_pipelining' from source: unknown 19110 1726882571.07383: variable 'ansible_timeout' from source: unknown 19110 1726882571.07387: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882571.07512: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 19110 1726882571.07528: variable 'omit' from source: magic vars 19110 1726882571.07535: starting attempt loop 19110 1726882571.07538: running the handler 19110 1726882571.07551: _low_level_execute_command(): starting 19110 1726882571.07560: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 19110 1726882571.08238: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882571.08250: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882571.08261: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882571.08279: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882571.08331: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882571.08338: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882571.08348: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882571.08361: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882571.08373: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882571.08380: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882571.08388: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882571.08399: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882571.08416: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882571.08423: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 
10.31.44.90 originally 10.31.44.90 <<< 19110 1726882571.08432: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882571.08438: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882571.08560: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882571.08641: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882571.08662: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882571.08853: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882571.10454: stdout chunk (state=3): >>>/root <<< 19110 1726882571.10620: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882571.10626: stdout chunk (state=3): >>><<< 19110 1726882571.10634: stderr chunk (state=3): >>><<< 19110 1726882571.10658: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 1726882571.10667: _low_level_execute_command(): starting 19110 1726882571.10675: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882571.1065438-20364-120684244017552 `" && echo ansible-tmp-1726882571.1065438-20364-120684244017552="` echo /root/.ansible/tmp/ansible-tmp-1726882571.1065438-20364-120684244017552 `" ) && sleep 0' 19110 1726882571.12373: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882571.12382: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882571.12395: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882571.12408: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882571.12449: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882571.12457: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882571.12466: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882571.12479: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882571.12486: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882571.12493: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882571.12505: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882571.12514: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882571.12524: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882571.12531: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882571.12538: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882571.12547: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882571.12622: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882571.12636: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882571.12647: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882571.12787: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882571.14657: stdout chunk (state=3): >>>ansible-tmp-1726882571.1065438-20364-120684244017552=/root/.ansible/tmp/ansible-tmp-1726882571.1065438-20364-120684244017552 <<< 19110 1726882571.14852: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882571.14858: stdout chunk (state=3): >>><<< 19110 1726882571.14861: stderr chunk (state=3): >>><<< 19110 1726882571.14972: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882571.1065438-20364-120684244017552=/root/.ansible/tmp/ansible-tmp-1726882571.1065438-20364-120684244017552 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 1726882571.14979: variable 'ansible_module_compression' from source: unknown 19110 1726882571.15171: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-191108pnkimox/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 19110 1726882571.15175: variable 'ansible_facts' from source: unknown 19110 1726882571.15192: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882571.1065438-20364-120684244017552/AnsiballZ_network_connections.py 19110 1726882571.15347: Sending initial data 19110 1726882571.15350: Sent initial data (168 bytes) 19110 1726882571.16308: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882571.16323: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882571.16339: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882571.16362: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882571.16410: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882571.16423: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882571.16438: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882571.16459: stderr chunk (state=3): >>>debug1: 
configuration requests final Match pass <<< 19110 1726882571.16477: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882571.16493: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882571.16506: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882571.16520: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882571.16537: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882571.16549: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882571.16566: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882571.16582: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882571.16663: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882571.16688: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882571.16710: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882571.16831: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882571.18566: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 19110 1726882571.18669: stderr 
chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 19110 1726882571.18763: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-191108pnkimox/tmpsk958274 /root/.ansible/tmp/ansible-tmp-1726882571.1065438-20364-120684244017552/AnsiballZ_network_connections.py <<< 19110 1726882571.18847: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 19110 1726882571.20970: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882571.21199: stderr chunk (state=3): >>><<< 19110 1726882571.21203: stdout chunk (state=3): >>><<< 19110 1726882571.21205: done transferring module to remote 19110 1726882571.21207: _low_level_execute_command(): starting 19110 1726882571.21210: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882571.1065438-20364-120684244017552/ /root/.ansible/tmp/ansible-tmp-1726882571.1065438-20364-120684244017552/AnsiballZ_network_connections.py && sleep 0' 19110 1726882571.22128: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882571.22146: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882571.22163: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882571.22193: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882571.22896: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882571.22915: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882571.22983: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882571.23002: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 
1726882571.23021: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882571.23033: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882571.23045: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882571.23062: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882571.23081: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882571.23093: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882571.23105: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882571.23118: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882571.23275: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882571.23297: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882571.23314: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882571.23439: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882571.25262: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882571.25267: stdout chunk (state=3): >>><<< 19110 1726882571.25270: stderr chunk (state=3): >>><<< 19110 1726882571.25351: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 1726882571.25361: _low_level_execute_command(): starting 19110 1726882571.25366: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882571.1065438-20364-120684244017552/AnsiballZ_network_connections.py && sleep 0' 19110 1726882571.26052: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882571.26074: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882571.26089: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882571.26107: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882571.26146: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882571.26161: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882571.26177: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882571.26195: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882571.26206: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is 
address <<< 19110 1726882571.26217: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882571.26228: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882571.26242: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882571.26262: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882571.26279: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882571.26291: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882571.26305: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882571.26385: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882571.26401: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882571.26416: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882571.26546: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882571.52471: stdout chunk (state=3): >>> <<< 19110 1726882571.52476: stdout chunk (state=3): >>>{"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr27", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr27", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 19110 1726882571.54579: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
<<< 19110 1726882571.54583: stdout chunk (state=3): >>><<< 19110 1726882571.54585: stderr chunk (state=3): >>><<< 19110 1726882571.54714: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr27", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr27", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
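The module result captured above arrives as a single JSON object on stdout. A minimal sketch of pulling the `changed` flag and the requested connection states out of such a payload (the JSON below is a trimmed, illustrative copy of what the worker logged, not the full record):

```python
import json

# Trimmed copy of the module-result JSON logged on stdout above
# (fields abbreviated; key names as they appear in the log).
raw = ('{"changed": true, "warnings": [], "stderr": "\\n", '
       '"invocation": {"module_args": {"provider": "nm", '
       '"connections": [{"name": "lsr27", "state": "down"}]}}}')

result = json.loads(raw)
changed = result["changed"]  # True: the lsr27 profile was taken down
# Map each requested connection name to its target state.
states = {c["name"]: c["state"]
          for c in result["invocation"]["module_args"]["connections"]}
```

With this run's payload, `states` comes out as `{'lsr27': 'down'}`, matching the `connections` argument shown in both the `_invocation` and `invocation` copies of the result.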
19110 1726882571.54717: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'lsr27', 'state': 'down'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882571.1065438-20364-120684244017552/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 19110 1726882571.54720: _low_level_execute_command(): starting 19110 1726882571.54722: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882571.1065438-20364-120684244017552/ > /dev/null 2>&1 && sleep 0' 19110 1726882571.56144: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882571.56148: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882571.56307: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found <<< 19110 1726882571.56310: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882571.56313: stderr chunk (state=3): >>>debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882571.56315: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882571.56372: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882571.56500: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882571.56503: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882571.56609: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882571.58400: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882571.58476: stderr chunk (state=3): >>><<< 19110 1726882571.58479: stdout chunk (state=3): >>><<< 19110 1726882571.58874: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 1726882571.58878: handler run complete 19110 1726882571.58881: attempt loop complete, returning result 19110 1726882571.58883: _execute() done 19110 1726882571.58885: dumping result to json 19110 1726882571.58887: done dumping result, returning 19110 1726882571.58889: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0e448fcc-3ce9-5372-c19a-00000000004c] 19110 1726882571.58891: sending task result for task 0e448fcc-3ce9-5372-c19a-00000000004c 19110 1726882571.58979: done sending task result for task 0e448fcc-3ce9-5372-c19a-00000000004c 19110 1726882571.58983: WORKER PROCESS EXITING changed: [managed_node1] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "lsr27", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 19110 1726882571.59062: no more pending results, returning what we have 19110 1726882571.59068: results queue empty 19110 1726882571.59069: checking for any_errors_fatal 19110 1726882571.59076: done checking for any_errors_fatal 19110 1726882571.59077: checking for max_fail_percentage 19110 1726882571.59079: done checking for max_fail_percentage 19110 1726882571.59080: checking to see if all hosts have failed and the running result is not ok 19110 1726882571.59081: done checking to see if all hosts have failed 19110 1726882571.59081: getting the remaining hosts for this loop 19110 1726882571.59083: done getting the remaining hosts for this loop 19110 1726882571.59086: getting the next task for host managed_node1 19110 1726882571.59091: 
done getting next task for host managed_node1 19110 1726882571.59095: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 19110 1726882571.59097: ^ state is: HOST STATE: block=2, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19110 1726882571.59107: getting variables 19110 1726882571.59109: in VariableManager get_vars() 19110 1726882571.59147: Calling all_inventory to load vars for managed_node1 19110 1726882571.59150: Calling groups_inventory to load vars for managed_node1 19110 1726882571.59153: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882571.59162: Calling all_plugins_play to load vars for managed_node1 19110 1726882571.59171: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882571.59175: Calling groups_plugins_play to load vars for managed_node1 19110 1726882571.62138: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882571.65773: done with get_vars() 19110 1726882571.65889: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 21:36:11 -0400 (0:00:00.810) 0:00:28.517 ****** 19110 1726882571.65983: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_state 19110 1726882571.66810: worker is 1 (out of 1 available) 19110 1726882571.66825: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_state 19110 1726882571.66838: done queuing things up, now waiting for results queue to drain 19110 1726882571.66840: waiting for pending results... 
19110 1726882571.68836: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state 19110 1726882571.68938: in run() - task 0e448fcc-3ce9-5372-c19a-00000000004d 19110 1726882571.68961: variable 'ansible_search_path' from source: unknown 19110 1726882571.68966: variable 'ansible_search_path' from source: unknown 19110 1726882571.69006: calling self._execute() 19110 1726882571.69104: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882571.69108: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882571.69117: variable 'omit' from source: magic vars 19110 1726882571.69512: variable 'ansible_distribution_major_version' from source: facts 19110 1726882571.69525: Evaluated conditional (ansible_distribution_major_version != '6'): True 19110 1726882571.69653: variable 'network_state' from source: role '' defaults 19110 1726882571.69665: Evaluated conditional (network_state != {}): False 19110 1726882571.69668: when evaluation is False, skipping this task 19110 1726882571.69671: _execute() done 19110 1726882571.69674: dumping result to json 19110 1726882571.69676: done dumping result, returning 19110 1726882571.69684: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state [0e448fcc-3ce9-5372-c19a-00000000004d] 19110 1726882571.69691: sending task result for task 0e448fcc-3ce9-5372-c19a-00000000004d 19110 1726882571.69788: done sending task result for task 0e448fcc-3ce9-5372-c19a-00000000004d 19110 1726882571.69792: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 19110 1726882571.69871: no more pending results, returning what we have 19110 1726882571.69876: results queue empty 19110 1726882571.69877: checking for any_errors_fatal 19110 1726882571.69888: done checking for any_errors_fatal 
19110 1726882571.69889: checking for max_fail_percentage 19110 1726882571.69891: done checking for max_fail_percentage 19110 1726882571.69892: checking to see if all hosts have failed and the running result is not ok 19110 1726882571.69893: done checking to see if all hosts have failed 19110 1726882571.69894: getting the remaining hosts for this loop 19110 1726882571.69895: done getting the remaining hosts for this loop 19110 1726882571.69899: getting the next task for host managed_node1 19110 1726882571.69905: done getting next task for host managed_node1 19110 1726882571.69910: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 19110 1726882571.69913: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19110 1726882571.69927: getting variables 19110 1726882571.69929: in VariableManager get_vars() 19110 1726882571.69969: Calling all_inventory to load vars for managed_node1 19110 1726882571.69973: Calling groups_inventory to load vars for managed_node1 19110 1726882571.69975: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882571.69988: Calling all_plugins_play to load vars for managed_node1 19110 1726882571.69991: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882571.69994: Calling groups_plugins_play to load vars for managed_node1 19110 1726882571.71692: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882571.73626: done with get_vars() 19110 1726882571.73653: done getting variables 19110 1726882571.73713: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 21:36:11 -0400 (0:00:00.077) 0:00:28.594 ****** 19110 1726882571.73750: entering _queue_task() for managed_node1/debug 19110 1726882571.74067: worker is 1 (out of 1 available) 19110 1726882571.74081: exiting _queue_task() for managed_node1/debug 19110 1726882571.74094: done queuing things up, now waiting for results queue to drain 19110 1726882571.74096: waiting for pending results... 
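The skip above follows from the role's default of `network_state: {}`, which makes the task's conditional `network_state != {}` evaluate False. A hedged sketch of what would flip that conditional so "Configure networking state" actually runs (interface values are illustrative, not taken from this run):

```yaml
# Passing a non-empty network_state to fedora.linux_system_roles.network
# makes `network_state != {}` True, so the "Configure networking state"
# task would execute instead of being skipped.
- hosts: managed_node1
  roles:
    - role: fedora.linux_system_roles.network
      vars:
        network_state:
          interfaces:
            - name: eth1        # illustrative interface, not from this log
              type: ethernet
              state: up
```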
19110 1726882571.74389: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 19110 1726882571.74486: in run() - task 0e448fcc-3ce9-5372-c19a-00000000004e 19110 1726882571.74500: variable 'ansible_search_path' from source: unknown 19110 1726882571.74503: variable 'ansible_search_path' from source: unknown 19110 1726882571.74545: calling self._execute() 19110 1726882571.74633: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882571.74637: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882571.74650: variable 'omit' from source: magic vars 19110 1726882571.75027: variable 'ansible_distribution_major_version' from source: facts 19110 1726882571.75039: Evaluated conditional (ansible_distribution_major_version != '6'): True 19110 1726882571.75045: variable 'omit' from source: magic vars 19110 1726882571.75089: variable 'omit' from source: magic vars 19110 1726882571.75121: variable 'omit' from source: magic vars 19110 1726882571.75166: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19110 1726882571.75201: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19110 1726882571.75220: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19110 1726882571.75237: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19110 1726882571.75248: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19110 1726882571.75282: variable 'inventory_hostname' from source: host vars for 'managed_node1' 19110 1726882571.75285: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882571.75287: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node1' 19110 1726882571.75389: Set connection var ansible_timeout to 10 19110 1726882571.75401: Set connection var ansible_module_compression to ZIP_DEFLATED 19110 1726882571.75412: Set connection var ansible_shell_executable to /bin/sh 19110 1726882571.75415: Set connection var ansible_shell_type to sh 19110 1726882571.75417: Set connection var ansible_connection to ssh 19110 1726882571.75423: Set connection var ansible_pipelining to False 19110 1726882571.75445: variable 'ansible_shell_executable' from source: unknown 19110 1726882571.75448: variable 'ansible_connection' from source: unknown 19110 1726882571.75451: variable 'ansible_module_compression' from source: unknown 19110 1726882571.75453: variable 'ansible_shell_type' from source: unknown 19110 1726882571.75458: variable 'ansible_shell_executable' from source: unknown 19110 1726882571.75461: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882571.75465: variable 'ansible_pipelining' from source: unknown 19110 1726882571.75467: variable 'ansible_timeout' from source: unknown 19110 1726882571.75470: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882571.75619: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 19110 1726882571.75635: variable 'omit' from source: magic vars 19110 1726882571.75640: starting attempt loop 19110 1726882571.75642: running the handler 19110 1726882571.75761: variable '__network_connections_result' from source: set_fact 19110 1726882571.75820: handler run complete 19110 1726882571.75839: attempt loop complete, returning result 19110 1726882571.75843: _execute() done 19110 1726882571.75851: dumping result to json 19110 1726882571.75854: 
done dumping result, returning 19110 1726882571.75866: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0e448fcc-3ce9-5372-c19a-00000000004e] 19110 1726882571.75871: sending task result for task 0e448fcc-3ce9-5372-c19a-00000000004e 19110 1726882571.75954: done sending task result for task 0e448fcc-3ce9-5372-c19a-00000000004e 19110 1726882571.75960: WORKER PROCESS EXITING ok: [managed_node1] => { "__network_connections_result.stderr_lines": [ "" ] } 19110 1726882571.76023: no more pending results, returning what we have 19110 1726882571.76027: results queue empty 19110 1726882571.76028: checking for any_errors_fatal 19110 1726882571.76034: done checking for any_errors_fatal 19110 1726882571.76034: checking for max_fail_percentage 19110 1726882571.76036: done checking for max_fail_percentage 19110 1726882571.76037: checking to see if all hosts have failed and the running result is not ok 19110 1726882571.76038: done checking to see if all hosts have failed 19110 1726882571.76038: getting the remaining hosts for this loop 19110 1726882571.76040: done getting the remaining hosts for this loop 19110 1726882571.76044: getting the next task for host managed_node1 19110 1726882571.76049: done getting next task for host managed_node1 19110 1726882571.76053: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 19110 1726882571.76056: ^ state is: HOST STATE: block=2, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19110 1726882571.76067: getting variables 19110 1726882571.76069: in VariableManager get_vars() 19110 1726882571.76106: Calling all_inventory to load vars for managed_node1 19110 1726882571.76108: Calling groups_inventory to load vars for managed_node1 19110 1726882571.76110: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882571.76120: Calling all_plugins_play to load vars for managed_node1 19110 1726882571.76123: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882571.76126: Calling groups_plugins_play to load vars for managed_node1 19110 1726882571.78627: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882571.81832: done with get_vars() 19110 1726882571.81861: done getting variables 19110 1726882571.82038: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 21:36:11 -0400 (0:00:00.083) 0:00:28.677 ****** 19110 1726882571.82071: entering _queue_task() for managed_node1/debug 19110 1726882571.82729: worker is 1 (out of 1 available) 19110 1726882571.82742: exiting _queue_task() for managed_node1/debug 19110 1726882571.82755: done queuing things up, now waiting for results queue to drain 19110 1726882571.82757: waiting for pending results... 
19110 1726882571.83679: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 19110 1726882571.83775: in run() - task 0e448fcc-3ce9-5372-c19a-00000000004f 19110 1726882571.83791: variable 'ansible_search_path' from source: unknown 19110 1726882571.83795: variable 'ansible_search_path' from source: unknown 19110 1726882571.83827: calling self._execute() 19110 1726882571.83916: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882571.83920: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882571.83929: variable 'omit' from source: magic vars 19110 1726882571.84291: variable 'ansible_distribution_major_version' from source: facts 19110 1726882571.84307: Evaluated conditional (ansible_distribution_major_version != '6'): True 19110 1726882571.84313: variable 'omit' from source: magic vars 19110 1726882571.84346: variable 'omit' from source: magic vars 19110 1726882571.84384: variable 'omit' from source: magic vars 19110 1726882571.84428: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19110 1726882571.84462: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19110 1726882571.84481: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19110 1726882571.84498: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19110 1726882571.84510: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19110 1726882571.84545: variable 'inventory_hostname' from source: host vars for 'managed_node1' 19110 1726882571.84548: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882571.84551: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node1' 19110 1726882571.84658: Set connection var ansible_timeout to 10 19110 1726882571.84671: Set connection var ansible_module_compression to ZIP_DEFLATED 19110 1726882571.84676: Set connection var ansible_shell_executable to /bin/sh 19110 1726882571.84679: Set connection var ansible_shell_type to sh 19110 1726882571.84681: Set connection var ansible_connection to ssh 19110 1726882571.84687: Set connection var ansible_pipelining to False 19110 1726882571.84708: variable 'ansible_shell_executable' from source: unknown 19110 1726882571.84711: variable 'ansible_connection' from source: unknown 19110 1726882571.84713: variable 'ansible_module_compression' from source: unknown 19110 1726882571.84716: variable 'ansible_shell_type' from source: unknown 19110 1726882571.84718: variable 'ansible_shell_executable' from source: unknown 19110 1726882571.84720: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882571.84724: variable 'ansible_pipelining' from source: unknown 19110 1726882571.84733: variable 'ansible_timeout' from source: unknown 19110 1726882571.84738: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882571.84876: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 19110 1726882571.84886: variable 'omit' from source: magic vars 19110 1726882571.84891: starting attempt loop 19110 1726882571.84894: running the handler 19110 1726882571.84940: variable '__network_connections_result' from source: set_fact 19110 1726882571.85018: variable '__network_connections_result' from source: set_fact 19110 1726882571.85125: handler run complete 19110 1726882571.85175: attempt loop complete, returning result 19110 1726882571.85178: 
_execute() done 19110 1726882571.85181: dumping result to json 19110 1726882571.85183: done dumping result, returning 19110 1726882571.85185: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0e448fcc-3ce9-5372-c19a-00000000004f] 19110 1726882571.85187: sending task result for task 0e448fcc-3ce9-5372-c19a-00000000004f 19110 1726882571.85295: done sending task result for task 0e448fcc-3ce9-5372-c19a-00000000004f 19110 1726882571.85298: WORKER PROCESS EXITING ok: [managed_node1] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "lsr27", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 19110 1726882571.85393: no more pending results, returning what we have 19110 1726882571.85396: results queue empty 19110 1726882571.85397: checking for any_errors_fatal 19110 1726882571.85402: done checking for any_errors_fatal 19110 1726882571.85403: checking for max_fail_percentage 19110 1726882571.85404: done checking for max_fail_percentage 19110 1726882571.85405: checking to see if all hosts have failed and the running result is not ok 19110 1726882571.85406: done checking to see if all hosts have failed 19110 1726882571.85406: getting the remaining hosts for this loop 19110 1726882571.85408: done getting the remaining hosts for this loop 19110 1726882571.85411: getting the next task for host managed_node1 19110 1726882571.85416: done getting next task for host managed_node1 19110 1726882571.85419: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 19110 1726882571.85422: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, 
update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19110 1726882571.85430: getting variables 19110 1726882571.85431: in VariableManager get_vars() 19110 1726882571.85466: Calling all_inventory to load vars for managed_node1 19110 1726882571.85469: Calling groups_inventory to load vars for managed_node1 19110 1726882571.85471: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882571.85481: Calling all_plugins_play to load vars for managed_node1 19110 1726882571.85484: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882571.85487: Calling groups_plugins_play to load vars for managed_node1 19110 1726882571.88567: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882571.90356: done with get_vars() 19110 1726882571.90380: done getting variables 19110 1726882571.90437: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 21:36:11 -0400 (0:00:00.083) 0:00:28.761 ****** 19110 1726882571.90476: entering _queue_task() for managed_node1/debug 19110 1726882571.90873: worker is 1 (out of 1 available) 19110 1726882571.90887: exiting _queue_task() for managed_node1/debug 19110 1726882571.90904: done queuing things up, now waiting for results queue to drain 19110 1726882571.90905: waiting for pending results... 
19110 1726882571.91194: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 19110 1726882571.91300: in run() - task 0e448fcc-3ce9-5372-c19a-000000000050 19110 1726882571.91314: variable 'ansible_search_path' from source: unknown 19110 1726882571.91317: variable 'ansible_search_path' from source: unknown 19110 1726882571.91361: calling self._execute() 19110 1726882571.91451: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882571.91461: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882571.91472: variable 'omit' from source: magic vars 19110 1726882571.91834: variable 'ansible_distribution_major_version' from source: facts 19110 1726882571.91847: Evaluated conditional (ansible_distribution_major_version != '6'): True 19110 1726882571.91976: variable 'network_state' from source: role '' defaults 19110 1726882571.91985: Evaluated conditional (network_state != {}): False 19110 1726882571.91992: when evaluation is False, skipping this task 19110 1726882571.91995: _execute() done 19110 1726882571.91998: dumping result to json 19110 1726882571.92006: done dumping result, returning 19110 1726882571.92014: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0e448fcc-3ce9-5372-c19a-000000000050] 19110 1726882571.92021: sending task result for task 0e448fcc-3ce9-5372-c19a-000000000050 19110 1726882571.92114: done sending task result for task 0e448fcc-3ce9-5372-c19a-000000000050 19110 1726882571.92117: WORKER PROCESS EXITING skipping: [managed_node1] => { "false_condition": "network_state != {}" } 19110 1726882571.92166: no more pending results, returning what we have 19110 1726882571.92170: results queue empty 19110 1726882571.92171: checking for any_errors_fatal 19110 1726882571.92178: done checking for any_errors_fatal 19110 1726882571.92179: checking for 
max_fail_percentage 19110 1726882571.92180: done checking for max_fail_percentage 19110 1726882571.92181: checking to see if all hosts have failed and the running result is not ok 19110 1726882571.92182: done checking to see if all hosts have failed 19110 1726882571.92183: getting the remaining hosts for this loop 19110 1726882571.92185: done getting the remaining hosts for this loop 19110 1726882571.92188: getting the next task for host managed_node1 19110 1726882571.92195: done getting next task for host managed_node1 19110 1726882571.92201: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 19110 1726882571.92204: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19110 1726882571.92218: getting variables 19110 1726882571.92220: in VariableManager get_vars() 19110 1726882571.92260: Calling all_inventory to load vars for managed_node1 19110 1726882571.92263: Calling groups_inventory to load vars for managed_node1 19110 1726882571.92267: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882571.92279: Calling all_plugins_play to load vars for managed_node1 19110 1726882571.92283: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882571.92285: Calling groups_plugins_play to load vars for managed_node1 19110 1726882571.93869: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882571.95795: done with get_vars() 19110 1726882571.95823: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 21:36:11 -0400 
(0:00:00.054) 0:00:28.816 ****** 19110 1726882571.95913: entering _queue_task() for managed_node1/ping 19110 1726882571.96152: worker is 1 (out of 1 available) 19110 1726882571.96165: exiting _queue_task() for managed_node1/ping 19110 1726882571.96178: done queuing things up, now waiting for results queue to drain 19110 1726882571.96180: waiting for pending results... 19110 1726882571.96466: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity 19110 1726882571.96561: in run() - task 0e448fcc-3ce9-5372-c19a-000000000051 19110 1726882571.96574: variable 'ansible_search_path' from source: unknown 19110 1726882571.96578: variable 'ansible_search_path' from source: unknown 19110 1726882571.96614: calling self._execute() 19110 1726882571.96703: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882571.96707: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882571.96717: variable 'omit' from source: magic vars 19110 1726882571.97096: variable 'ansible_distribution_major_version' from source: facts 19110 1726882571.97108: Evaluated conditional (ansible_distribution_major_version != '6'): True 19110 1726882571.97114: variable 'omit' from source: magic vars 19110 1726882571.97157: variable 'omit' from source: magic vars 19110 1726882571.97195: variable 'omit' from source: magic vars 19110 1726882571.97239: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19110 1726882571.97276: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19110 1726882571.97296: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19110 1726882571.97314: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19110 1726882571.97324: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19110 1726882571.97359: variable 'inventory_hostname' from source: host vars for 'managed_node1' 19110 1726882571.97362: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882571.97367: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882571.97453: Set connection var ansible_timeout to 10 19110 1726882571.97468: Set connection var ansible_module_compression to ZIP_DEFLATED 19110 1726882571.97473: Set connection var ansible_shell_executable to /bin/sh 19110 1726882571.97476: Set connection var ansible_shell_type to sh 19110 1726882571.97478: Set connection var ansible_connection to ssh 19110 1726882571.97483: Set connection var ansible_pipelining to False 19110 1726882571.97510: variable 'ansible_shell_executable' from source: unknown 19110 1726882571.97513: variable 'ansible_connection' from source: unknown 19110 1726882571.97516: variable 'ansible_module_compression' from source: unknown 19110 1726882571.97518: variable 'ansible_shell_type' from source: unknown 19110 1726882571.97521: variable 'ansible_shell_executable' from source: unknown 19110 1726882571.97523: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882571.97525: variable 'ansible_pipelining' from source: unknown 19110 1726882571.97529: variable 'ansible_timeout' from source: unknown 19110 1726882571.97532: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882571.97739: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 19110 1726882571.97749: variable 'omit' from source: magic vars 19110 1726882571.97754: starting attempt loop 19110 1726882571.97759: running 
the handler 19110 1726882571.97771: _low_level_execute_command(): starting 19110 1726882571.97784: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 19110 1726882571.99065: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882571.99075: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882571.99120: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882571.99124: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882571.99139: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882571.99146: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882571.99236: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882571.99259: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882571.99411: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882572.01073: stdout chunk (state=3): >>>/root <<< 19110 1726882572.01231: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882572.01237: stderr chunk (state=3): >>><<< 19110 1726882572.01241: stdout 
chunk (state=3): >>><<< 19110 1726882572.01267: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 1726882572.01281: _low_level_execute_command(): starting 19110 1726882572.01287: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882572.0126731-20428-3423692407994 `" && echo ansible-tmp-1726882572.0126731-20428-3423692407994="` echo /root/.ansible/tmp/ansible-tmp-1726882572.0126731-20428-3423692407994 `" ) && sleep 0' 19110 1726882572.03004: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882572.03007: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882572.03046: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882572.03059: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration <<< 19110 1726882572.03074: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882572.03080: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found <<< 19110 1726882572.03093: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882572.03167: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882572.03172: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882572.03495: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882572.03592: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882572.05458: stdout chunk (state=3): >>>ansible-tmp-1726882572.0126731-20428-3423692407994=/root/.ansible/tmp/ansible-tmp-1726882572.0126731-20428-3423692407994 <<< 19110 1726882572.05619: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882572.05623: stderr chunk (state=3): >>><<< 19110 1726882572.05628: stdout chunk (state=3): >>><<< 19110 1726882572.05646: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882572.0126731-20428-3423692407994=/root/.ansible/tmp/ansible-tmp-1726882572.0126731-20428-3423692407994 , 
stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 1726882572.05695: variable 'ansible_module_compression' from source: unknown 19110 1726882572.05737: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-191108pnkimox/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 19110 1726882572.05773: variable 'ansible_facts' from source: unknown 19110 1726882572.05853: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882572.0126731-20428-3423692407994/AnsiballZ_ping.py 19110 1726882572.06412: Sending initial data 19110 1726882572.06415: Sent initial data (151 bytes) 19110 1726882572.08694: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882572.08700: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882572.08741: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882572.08746: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration <<< 19110 1726882572.08752: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882572.08774: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882572.08791: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882572.08859: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882572.08883: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882572.09169: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882572.10733: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 19110 1726882572.10825: 
stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 19110 1726882572.10925: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-191108pnkimox/tmp62khbn9z /root/.ansible/tmp/ansible-tmp-1726882572.0126731-20428-3423692407994/AnsiballZ_ping.py <<< 19110 1726882572.11018: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 19110 1726882572.12761: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882572.12942: stderr chunk (state=3): >>><<< 19110 1726882572.12945: stdout chunk (state=3): >>><<< 19110 1726882572.12947: done transferring module to remote 19110 1726882572.12950: _low_level_execute_command(): starting 19110 1726882572.12952: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882572.0126731-20428-3423692407994/ /root/.ansible/tmp/ansible-tmp-1726882572.0126731-20428-3423692407994/AnsiballZ_ping.py && sleep 0' 19110 1726882572.15498: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882572.15514: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882572.15530: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882572.15549: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882572.15596: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882572.15609: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882572.15624: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882572.15643: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882572.15659: stderr 
chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882572.15673: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882572.15686: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882572.15700: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882572.15721: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882572.15735: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882572.15747: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882572.15766: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882572.15841: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882572.15982: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882572.15998: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882572.16288: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882572.18091: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882572.18094: stdout chunk (state=3): >>><<< 19110 1726882572.18097: stderr chunk (state=3): >>><<< 19110 1726882572.18170: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 1726882572.18174: _low_level_execute_command(): starting 19110 1726882572.18176: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882572.0126731-20428-3423692407994/AnsiballZ_ping.py && sleep 0' 19110 1726882572.19829: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882572.19842: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882572.19858: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882572.19881: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882572.19922: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882572.19934: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882572.19947: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882572.19968: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882572.19981: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882572.19991: stderr chunk 
(state=3): >>>debug1: re-parsing configuration <<< 19110 1726882572.20001: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882572.20013: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882572.20028: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882572.20041: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882572.20053: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882572.20073: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882572.20148: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882572.20169: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882572.20186: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882572.20488: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882572.33399: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 19110 1726882572.34476: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
<<< 19110 1726882572.34480: stdout chunk (state=3): >>><<< 19110 1726882572.34483: stderr chunk (state=3): >>><<< 19110 1726882572.34608: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
19110 1726882572.34612: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882572.0126731-20428-3423692407994/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 19110 1726882572.34614: _low_level_execute_command(): starting 19110 1726882572.34616: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882572.0126731-20428-3423692407994/ > /dev/null 2>&1 && sleep 0' 19110 1726882572.35185: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882572.35198: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882572.35211: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882572.35227: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882572.35277: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882572.35290: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882572.35307: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882572.35323: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882572.35334: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 
1726882572.35347: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882572.35366: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882572.35387: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882572.35402: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882572.35413: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882572.35422: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882572.35434: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882572.35516: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882572.35538: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882572.35568: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882572.35714: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882572.37566: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882572.37569: stdout chunk (state=3): >>><<< 19110 1726882572.37572: stderr chunk (state=3): >>><<< 19110 1726882572.38007: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 1726882572.38010: handler run complete 19110 1726882572.38013: attempt loop complete, returning result 19110 1726882572.38015: _execute() done 19110 1726882572.38017: dumping result to json 19110 1726882572.38018: done dumping result, returning 19110 1726882572.38020: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity [0e448fcc-3ce9-5372-c19a-000000000051] 19110 1726882572.38022: sending task result for task 0e448fcc-3ce9-5372-c19a-000000000051 19110 1726882572.38099: done sending task result for task 0e448fcc-3ce9-5372-c19a-000000000051 19110 1726882572.38102: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "ping": "pong" } 19110 1726882572.38158: no more pending results, returning what we have 19110 1726882572.38162: results queue empty 19110 1726882572.38163: checking for any_errors_fatal 19110 1726882572.38170: done checking for any_errors_fatal 19110 1726882572.38171: checking for max_fail_percentage 19110 1726882572.38173: done checking for max_fail_percentage 19110 1726882572.38174: checking to see if all hosts have failed and the running result is not ok 19110 1726882572.38174: done checking to see if all hosts have failed 19110 1726882572.38175: getting the remaining hosts for this loop 19110 1726882572.38176: done getting the remaining hosts for this loop 19110 1726882572.38180: getting 
the next task for host managed_node1 19110 1726882572.38188: done getting next task for host managed_node1 19110 1726882572.38191: ^ task is: TASK: meta (role_complete) 19110 1726882572.38193: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19110 1726882572.38204: getting variables 19110 1726882572.38205: in VariableManager get_vars() 19110 1726882572.38241: Calling all_inventory to load vars for managed_node1 19110 1726882572.38244: Calling groups_inventory to load vars for managed_node1 19110 1726882572.38247: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882572.38256: Calling all_plugins_play to load vars for managed_node1 19110 1726882572.38259: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882572.38262: Calling groups_plugins_play to load vars for managed_node1 19110 1726882572.40982: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882572.42748: done with get_vars() 19110 1726882572.42777: done getting variables 19110 1726882572.42860: done queuing things up, now waiting for results queue to drain 19110 1726882572.42862: results queue empty 19110 1726882572.42864: checking for any_errors_fatal 19110 1726882572.42867: done checking for any_errors_fatal 19110 1726882572.42868: checking for max_fail_percentage 19110 1726882572.42869: done checking for max_fail_percentage 19110 1726882572.42870: checking to see if all hosts have failed and the running result is not ok 19110 1726882572.42871: done checking to see if all hosts have failed 19110 1726882572.42872: getting the remaining hosts for this loop 19110 1726882572.42873: done getting the remaining hosts for this loop 19110 1726882572.42875: 
getting the next task for host managed_node1 19110 1726882572.42878: done getting next task for host managed_node1 19110 1726882572.42880: ^ task is: TASK: meta (flush_handlers) 19110 1726882572.42881: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19110 1726882572.42884: getting variables 19110 1726882572.42885: in VariableManager get_vars() 19110 1726882572.42897: Calling all_inventory to load vars for managed_node1 19110 1726882572.42899: Calling groups_inventory to load vars for managed_node1 19110 1726882572.42901: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882572.42906: Calling all_plugins_play to load vars for managed_node1 19110 1726882572.42909: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882572.42912: Calling groups_plugins_play to load vars for managed_node1 19110 1726882572.48365: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882572.50029: done with get_vars() 19110 1726882572.50052: done getting variables 19110 1726882572.50107: in VariableManager get_vars() 19110 1726882572.50118: Calling all_inventory to load vars for managed_node1 19110 1726882572.50120: Calling groups_inventory to load vars for managed_node1 19110 1726882572.50122: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882572.50126: Calling all_plugins_play to load vars for managed_node1 19110 1726882572.50128: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882572.50134: Calling groups_plugins_play to load vars for managed_node1 19110 1726882572.51345: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 
1726882572.53159: done with get_vars() 19110 1726882572.53184: done queuing things up, now waiting for results queue to drain 19110 1726882572.53186: results queue empty 19110 1726882572.53186: checking for any_errors_fatal 19110 1726882572.53188: done checking for any_errors_fatal 19110 1726882572.53188: checking for max_fail_percentage 19110 1726882572.53189: done checking for max_fail_percentage 19110 1726882572.53190: checking to see if all hosts have failed and the running result is not ok 19110 1726882572.53191: done checking to see if all hosts have failed 19110 1726882572.53192: getting the remaining hosts for this loop 19110 1726882572.53192: done getting the remaining hosts for this loop 19110 1726882572.53195: getting the next task for host managed_node1 19110 1726882572.53199: done getting next task for host managed_node1 19110 1726882572.53200: ^ task is: TASK: meta (flush_handlers) 19110 1726882572.53202: ^ state is: HOST STATE: block=5, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19110 1726882572.53205: getting variables 19110 1726882572.53206: in VariableManager get_vars() 19110 1726882572.53215: Calling all_inventory to load vars for managed_node1 19110 1726882572.53217: Calling groups_inventory to load vars for managed_node1 19110 1726882572.53218: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882572.53223: Calling all_plugins_play to load vars for managed_node1 19110 1726882572.53225: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882572.53227: Calling groups_plugins_play to load vars for managed_node1 19110 1726882572.54463: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882572.56512: done with get_vars() 19110 1726882572.56532: done getting variables 19110 1726882572.56583: in VariableManager get_vars() 19110 1726882572.56595: Calling all_inventory to load vars for managed_node1 19110 1726882572.56597: Calling groups_inventory to load vars for managed_node1 19110 1726882572.56599: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882572.56603: Calling all_plugins_play to load vars for managed_node1 19110 1726882572.56606: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882572.56609: Calling groups_plugins_play to load vars for managed_node1 19110 1726882572.58091: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882572.61732: done with get_vars() 19110 1726882572.61760: done queuing things up, now waiting for results queue to drain 19110 1726882572.61762: results queue empty 19110 1726882572.61765: checking for any_errors_fatal 19110 1726882572.61766: done checking for any_errors_fatal 19110 1726882572.61767: checking for max_fail_percentage 19110 1726882572.61768: done checking for max_fail_percentage 19110 1726882572.61769: checking to see if all hosts have failed and the running result is not 
ok 19110 1726882572.61770: done checking to see if all hosts have failed 19110 1726882572.61770: getting the remaining hosts for this loop 19110 1726882572.61771: done getting the remaining hosts for this loop 19110 1726882572.61774: getting the next task for host managed_node1 19110 1726882572.61777: done getting next task for host managed_node1 19110 1726882572.61778: ^ task is: None 19110 1726882572.61779: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19110 1726882572.61780: done queuing things up, now waiting for results queue to drain 19110 1726882572.61781: results queue empty 19110 1726882572.61782: checking for any_errors_fatal 19110 1726882572.61783: done checking for any_errors_fatal 19110 1726882572.61783: checking for max_fail_percentage 19110 1726882572.61784: done checking for max_fail_percentage 19110 1726882572.61785: checking to see if all hosts have failed and the running result is not ok 19110 1726882572.61786: done checking to see if all hosts have failed 19110 1726882572.61787: getting the next task for host managed_node1 19110 1726882572.61789: done getting next task for host managed_node1 19110 1726882572.61789: ^ task is: None 19110 1726882572.61791: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19110 1726882572.62475: in VariableManager get_vars() 19110 1726882572.62490: done with get_vars() 19110 1726882572.62496: in VariableManager get_vars() 19110 1726882572.62507: done with get_vars() 19110 1726882572.62511: variable 'omit' from source: magic vars 19110 1726882572.62539: in VariableManager get_vars() 19110 1726882572.62550: done with get_vars() 19110 1726882572.62575: variable 'omit' from source: magic vars PLAY [Delete the interface] **************************************************** 19110 1726882572.62792: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 19110 1726882572.63289: getting the remaining hosts for this loop 19110 1726882572.63290: done getting the remaining hosts for this loop 19110 1726882572.63293: getting the next task for host managed_node1 19110 1726882572.63295: done getting next task for host managed_node1 19110 1726882572.63297: ^ task is: TASK: Gathering Facts 19110 1726882572.63299: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19110 1726882572.63301: getting variables 19110 1726882572.63302: in VariableManager get_vars() 19110 1726882572.63310: Calling all_inventory to load vars for managed_node1 19110 1726882572.63312: Calling groups_inventory to load vars for managed_node1 19110 1726882572.63315: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882572.63319: Calling all_plugins_play to load vars for managed_node1 19110 1726882572.63322: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882572.63325: Calling groups_plugins_play to load vars for managed_node1 19110 1726882572.65307: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882572.68800: done with get_vars() 19110 1726882572.68823: done getting variables 19110 1726882572.68981: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile+delete_interface.yml:5 Friday 20 September 2024 21:36:12 -0400 (0:00:00.730) 0:00:29.547 ****** 19110 1726882572.69004: entering _queue_task() for managed_node1/gather_facts 19110 1726882572.69708: worker is 1 (out of 1 available) 19110 1726882572.69718: exiting _queue_task() for managed_node1/gather_facts 19110 1726882572.69729: done queuing things up, now waiting for results queue to drain 19110 1726882572.69730: waiting for pending results... 
19110 1726882572.70421: running TaskExecutor() for managed_node1/TASK: Gathering Facts 19110 1726882572.70796: in run() - task 0e448fcc-3ce9-5372-c19a-0000000003f8 19110 1726882572.70816: variable 'ansible_search_path' from source: unknown 19110 1726882572.70857: calling self._execute() 19110 1726882572.70956: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882572.71049: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882572.71090: variable 'omit' from source: magic vars 19110 1726882572.72701: variable 'ansible_distribution_major_version' from source: facts 19110 1726882572.72720: Evaluated conditional (ansible_distribution_major_version != '6'): True 19110 1726882572.72730: variable 'omit' from source: magic vars 19110 1726882572.72762: variable 'omit' from source: magic vars 19110 1726882572.72805: variable 'omit' from source: magic vars 19110 1726882572.72847: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19110 1726882572.72889: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19110 1726882572.72916: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19110 1726882572.72937: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19110 1726882572.72951: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19110 1726882572.72986: variable 'inventory_hostname' from source: host vars for 'managed_node1' 19110 1726882572.72994: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882572.73006: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882572.73152: Set connection var ansible_timeout to 10 19110 1726882572.73238: Set connection 
var ansible_module_compression to ZIP_DEFLATED 19110 1726882572.73247: Set connection var ansible_shell_executable to /bin/sh 19110 1726882572.73253: Set connection var ansible_shell_type to sh 19110 1726882572.73259: Set connection var ansible_connection to ssh 19110 1726882572.73270: Set connection var ansible_pipelining to False 19110 1726882572.73295: variable 'ansible_shell_executable' from source: unknown 19110 1726882572.73335: variable 'ansible_connection' from source: unknown 19110 1726882572.73342: variable 'ansible_module_compression' from source: unknown 19110 1726882572.73348: variable 'ansible_shell_type' from source: unknown 19110 1726882572.73354: variable 'ansible_shell_executable' from source: unknown 19110 1726882572.73360: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882572.73445: variable 'ansible_pipelining' from source: unknown 19110 1726882572.73452: variable 'ansible_timeout' from source: unknown 19110 1726882572.73459: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882572.73802: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 19110 1726882572.73819: variable 'omit' from source: magic vars 19110 1726882572.73829: starting attempt loop 19110 1726882572.73836: running the handler 19110 1726882572.73858: variable 'ansible_facts' from source: unknown 19110 1726882572.74002: _low_level_execute_command(): starting 19110 1726882572.74015: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 19110 1726882572.75706: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 
1726882572.75710: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882572.75739: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882572.75861: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882572.75867: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882572.75870: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882572.75922: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882572.76072: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882572.76080: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882572.76187: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882572.77860: stdout chunk (state=3): >>>/root <<< 19110 1726882572.77967: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882572.78034: stderr chunk (state=3): >>><<< 19110 1726882572.78037: stdout chunk (state=3): >>><<< 19110 1726882572.78152: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 1726882572.78158: _low_level_execute_command(): starting 19110 1726882572.78162: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882572.7806094-20468-151177719249279 `" && echo ansible-tmp-1726882572.7806094-20468-151177719249279="` echo /root/.ansible/tmp/ansible-tmp-1726882572.7806094-20468-151177719249279 `" ) && sleep 0' 19110 1726882572.80277: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882572.80280: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882572.80318: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found <<< 19110 1726882572.80329: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882572.80332: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882572.80381: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882572.80393: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882572.80514: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882572.82398: stdout chunk (state=3): >>>ansible-tmp-1726882572.7806094-20468-151177719249279=/root/.ansible/tmp/ansible-tmp-1726882572.7806094-20468-151177719249279 <<< 19110 1726882572.82512: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882572.82583: stderr chunk (state=3): >>><<< 19110 1726882572.82586: stdout chunk (state=3): >>><<< 19110 1726882572.82870: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882572.7806094-20468-151177719249279=/root/.ansible/tmp/ansible-tmp-1726882572.7806094-20468-151177719249279 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 1726882572.82874: variable 'ansible_module_compression' from source: unknown 19110 1726882572.82876: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-191108pnkimox/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 19110 1726882572.82879: variable 'ansible_facts' from source: unknown 19110 1726882572.82926: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882572.7806094-20468-151177719249279/AnsiballZ_setup.py 19110 1726882572.83095: Sending initial data 19110 1726882572.83098: Sent initial data (154 bytes) 19110 1726882572.84047: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882572.84155: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882572.84175: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882572.84194: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882572.84238: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882572.84250: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882572.84267: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882572.84287: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882572.84301: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882572.84318: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882572.84331: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882572.84346: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882572.84362: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882572.84378: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882572.84390: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882572.84408: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882572.84487: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882572.84504: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882572.84522: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882572.84648: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882572.86379: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 
1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 19110 1726882572.86479: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 19110 1726882572.86571: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-191108pnkimox/tmpoa5cclcg /root/.ansible/tmp/ansible-tmp-1726882572.7806094-20468-151177719249279/AnsiballZ_setup.py <<< 19110 1726882572.86668: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 19110 1726882572.89815: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882572.89981: stderr chunk (state=3): >>><<< 19110 1726882572.89984: stdout chunk (state=3): >>><<< 19110 1726882572.89986: done transferring module to remote 19110 1726882572.89988: _low_level_execute_command(): starting 19110 1726882572.89990: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882572.7806094-20468-151177719249279/ /root/.ansible/tmp/ansible-tmp-1726882572.7806094-20468-151177719249279/AnsiballZ_setup.py && sleep 0' 19110 1726882572.90578: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882572.90592: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882572.90607: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882572.90631: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882572.90681: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882572.90694: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882572.90708: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882572.90726: 
stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882572.90749: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882572.90768: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882572.90781: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882572.90796: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882572.90811: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882572.90822: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882572.90834: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882572.90853: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882572.90931: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882572.90952: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882572.90979: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882572.91314: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882572.92878: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882572.92945: stderr chunk (state=3): >>><<< 19110 1726882572.92948: stdout chunk (state=3): >>><<< 19110 1726882572.93035: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 1726882572.93038: _low_level_execute_command(): starting 19110 1726882572.93040: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882572.7806094-20468-151177719249279/AnsiballZ_setup.py && sleep 0' 19110 1726882572.94326: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882572.94338: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882572.94350: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882572.94372: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882572.94436: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882572.94450: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882572.94471: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882572.94490: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882572.94509: stderr chunk 
(state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882572.94527: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882572.94542: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882572.94559: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882572.94578: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882572.94590: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882572.94603: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882572.94619: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882572.94704: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882572.94726: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882572.94741: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882572.94876: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882573.47679: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_ssh_host_key_dsa_public": 
"AAAAB3NzaC1kc3MAAACBANd7RrvRqB/kTBmp4g3vOXSd1XQX7zSKmnuTmpsZ60vmB0PwXJIj7HTI9DBqGrzofXOOHlh/Qs4ml+D2H79EO+6EUEZza9meZ+anJVVwXBuxgfn1Hl/EExsmp7gV5o+r0rCFeuFds6ISUZQpal2OlQk3lrit00RA//JoxvQO1YQdAAAAFQDxLJGSeERY5sZYyGr88om1BNq9KQAAAIEA1WO5ElHMof6Upf9GQZn4wlJh6JUOjZfYnL4XATh/W6uye7kuC/rBqGPirkmks1GCUkKhSkzQlRIPyLYENrbPKRMNarYAwwQ8N8BDOUWFDCcrO55SJdlLTyYGWdlVFysYGSMIyZT5ye4oL3Nkff/e1ZGqjvL0sLNJaye4za70Xj4AAACAXRnpJPDKgCzakgHO14HsH3r7qS4cvfQld+ThBJZcbtKtJUyUo1sqg9NbvK+hvA41TYWOVy52DcWn5yzPwfhAn+mQEcAdBXWqSSJdESZ9fPnbc9C1XO02sYpyX5+wsZnau23XhtnlnY8jRTpWgRt4X8AWaSND9mfeL8ZDTgVphRc=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCi3knQjBMOMykE1WC1CWkOsV5q0hmL0iOCSTq050rqbxjUmkXoe4BeOWZh4ID8k5GgD5emRh5PU5ME25rsP6hnelUBzDOCjdgI1rmTkUbw5oFRN+kpX2RbAM/2L5J6RrqqllFBjrHtSG6JINsPIWqLn9rlmE965Kj4AY3QplgiApeN07yDM5YPS2tpRpVHVIxZia7pdzKs+h+TXJoo/Z7SMAw8MTUNeIXd9bSzuxhulOrLNrvrcej4EVg88FYiy96oA+NBNLFb41RHNLumM3xUQvjCeyqP1hcUw9YAY+gwADhRGBKQ+JCAzzLqyM/3RAO8gXXJr1Yjr+H9xi8IwKB71QU/lw7bWc33YuNbe5mDlUHQ/a2qvo4O96wD8m4eZpu81iHiwtIU5cwKm+fk8sz9kxOR77AozaYLtjgW9FYUmxh2ChfTBV2rnFflhC3CjgRMlZv8CLtne5JcsRFSpHeCB2RXzA1JPiF89OxoUFa8NsPqdUyAMUkaR8MmW+fj+t8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAqf0PBdBRTqmYqGYFABEC2soCa/Gsi3A2munUEZEo0enWjwRQivB5pKJDjOn6lwgGyr2ebU0/VpRzddk73uEfk=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOdij61fBkKDElS700Z560nYW2c4QCIx/VplUW7jn+UE", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_is_chroot": false, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", 
"ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "36", "second": "13", "epoch": "1726882573", "epoch_int": "1726882573", "date": "2024-09-20", "time": "21:36:13", "iso8601_micro": "2024-09-21T01:36:13.181140Z", "iso8601": "2024-09-21T01:36:13Z", "iso8601_basic": "20240920T213613181140", "iso8601_basic_short": "20240920T213613", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_system": "Linux", "ansible_kernel": "5.14.0-508.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 15:49:37 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "ip-10-31-44-90.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-44-90", "ansible_nodename": "ip-10-31-44-90.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "465293f2bd9b457497a5eaf565f184f8", "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_env": {"SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.40.7 34614 10.31.44.90 
22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.40.7 34614 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_lsb": {}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_fibre_channel_wwn": [], "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:ef4e1c39-6f50-438a-87e7-12fb70b80bde", "ansible_iscsi_iqn": "", "ansible_local": {}, "ansible_loadavg": {"1m": 0.55, "5m": 0.42, "15m": 0.23}, "ansible_apparmor": {"status": "disabled"}, "ansible_fips": false, "ansible_interfaces": ["lsr27", "eth0", "peerlsr27", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "02:9e:a1:0b:f9:6d", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": 
"10.31.44.90", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::9e:a1ff:fe0b:f96d", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off 
[fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lsr27": {"device": "lsr27", "macaddress": "a6:f2:4e:57:42:4a", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::b9de:9c61:daf5:5f43", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", 
"tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_peerlsr27": {"device": "peerlsr27", "macaddress": "3a:77:3d:04:80:fa", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::3877:3dff:fe04:80fa", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", 
"netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", 
"tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.44.90", 
"broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddr<<< 19110 1726882573.47703: stdout chunk (state=3): >>>ess": "02:9e:a1:0b:f9:6d", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.44.90"], "ansible_all_ipv6_addresses": ["fe80::9e:a1ff:fe0b:f96d", "fe80::b9de:9c61:daf5:5f43", "fe80::3877:3dff:fe04:80fa"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.44.90", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::9e:a1ff:fe0b:f96d", "fe80::3877:3dff:fe04:80fa", "fe80::b9de:9c61:daf5:5f43"]}, "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2798, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 734, "free": 2798}, "nocache": {"free": 3260, "used": 272}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2ca037-5a7b-1ebc-28c4-7fb32adce23a", "ansible_product_uuid": "ec2ca037-5a7b-1ebc-28c4-7fb32adce23a", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": 
[], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 731, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264239136768, "block_size": 4096, "block_total": 65519355, "block_available": 64511508, "block_used": 1007847, "inode_total": 131071472, "inode_available": 130998698, "inode_used": 72774, "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013"}], "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 19110 1726882573.49283: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
<<< 19110 1726882573.49359: stderr chunk (state=3): >>><<< 19110 1726882573.49363: stdout chunk (state=3): >>><<< 19110 1726882573.49674: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBANd7RrvRqB/kTBmp4g3vOXSd1XQX7zSKmnuTmpsZ60vmB0PwXJIj7HTI9DBqGrzofXOOHlh/Qs4ml+D2H79EO+6EUEZza9meZ+anJVVwXBuxgfn1Hl/EExsmp7gV5o+r0rCFeuFds6ISUZQpal2OlQk3lrit00RA//JoxvQO1YQdAAAAFQDxLJGSeERY5sZYyGr88om1BNq9KQAAAIEA1WO5ElHMof6Upf9GQZn4wlJh6JUOjZfYnL4XATh/W6uye7kuC/rBqGPirkmks1GCUkKhSkzQlRIPyLYENrbPKRMNarYAwwQ8N8BDOUWFDCcrO55SJdlLTyYGWdlVFysYGSMIyZT5ye4oL3Nkff/e1ZGqjvL0sLNJaye4za70Xj4AAACAXRnpJPDKgCzakgHO14HsH3r7qS4cvfQld+ThBJZcbtKtJUyUo1sqg9NbvK+hvA41TYWOVy52DcWn5yzPwfhAn+mQEcAdBXWqSSJdESZ9fPnbc9C1XO02sYpyX5+wsZnau23XhtnlnY8jRTpWgRt4X8AWaSND9mfeL8ZDTgVphRc=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCi3knQjBMOMykE1WC1CWkOsV5q0hmL0iOCSTq050rqbxjUmkXoe4BeOWZh4ID8k5GgD5emRh5PU5ME25rsP6hnelUBzDOCjdgI1rmTkUbw5oFRN+kpX2RbAM/2L5J6RrqqllFBjrHtSG6JINsPIWqLn9rlmE965Kj4AY3QplgiApeN07yDM5YPS2tpRpVHVIxZia7pdzKs+h+TXJoo/Z7SMAw8MTUNeIXd9bSzuxhulOrLNrvrcej4EVg88FYiy96oA+NBNLFb41RHNLumM3xUQvjCeyqP1hcUw9YAY+gwADhRGBKQ+JCAzzLqyM/3RAO8gXXJr1Yjr+H9xi8IwKB71QU/lw7bWc33YuNbe5mDlUHQ/a2qvo4O96wD8m4eZpu81iHiwtIU5cwKm+fk8sz9kxOR77AozaYLtjgW9FYUmxh2ChfTBV2rnFflhC3CjgRMlZv8CLtne5JcsRFSpHeCB2RXzA1JPiF89OxoUFa8NsPqdUyAMUkaR8MmW+fj+t8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAqf0PBdBRTqmYqGYFABEC2soCa/Gsi3A2munUEZEo0enWjwRQivB5pKJDjOn6lwgGyr2ebU0/VpRzddk73uEfk=", 
"ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOdij61fBkKDElS700Z560nYW2c4QCIx/VplUW7jn+UE", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_is_chroot": false, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "36", "second": "13", "epoch": "1726882573", "epoch_int": "1726882573", "date": "2024-09-20", "time": "21:36:13", "iso8601_micro": "2024-09-21T01:36:13.181140Z", "iso8601": "2024-09-21T01:36:13Z", "iso8601_basic": "20240920T213613181140", "iso8601_basic_short": "20240920T213613", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_system": "Linux", "ansible_kernel": "5.14.0-508.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 15:49:37 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "ip-10-31-44-90.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-44-90", "ansible_nodename": "ip-10-31-44-90.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "465293f2bd9b457497a5eaf565f184f8", "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], 
"executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_env": {"SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.40.7 34614 10.31.44.90 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.40.7 34614 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_lsb": {}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_fibre_channel_wwn": 
[], "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:ef4e1c39-6f50-438a-87e7-12fb70b80bde", "ansible_iscsi_iqn": "", "ansible_local": {}, "ansible_loadavg": {"1m": 0.55, "5m": 0.42, "15m": 0.23}, "ansible_apparmor": {"status": "disabled"}, "ansible_fips": false, "ansible_interfaces": ["lsr27", "eth0", "peerlsr27", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "02:9e:a1:0b:f9:6d", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.44.90", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::9e:a1ff:fe0b:f96d", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", 
"tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lsr27": {"device": "lsr27", "macaddress": "a6:f2:4e:57:42:4a", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::b9de:9c61:daf5:5f43", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": 
"on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_peerlsr27": {"device": "peerlsr27", "macaddress": "3a:77:3d:04:80:fa", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::3877:3dff:fe04:80fa", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", 
"tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": 
"255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", 
"tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.44.90", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:9e:a1:0b:f9:6d", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.44.90"], "ansible_all_ipv6_addresses": ["fe80::9e:a1ff:fe0b:f96d", "fe80::b9de:9c61:daf5:5f43", "fe80::3877:3dff:fe04:80fa"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.44.90", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::9e:a1ff:fe0b:f96d", "fe80::3877:3dff:fe04:80fa", "fe80::b9de:9c61:daf5:5f43"]}, "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2798, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 734, "free": 2798}, "nocache": {"free": 3260, "used": 272}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", 
"ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2ca037-5a7b-1ebc-28c4-7fb32adce23a", "ansible_product_uuid": "ec2ca037-5a7b-1ebc-28c4-7fb32adce23a", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 731, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264239136768, "block_size": 4096, "block_total": 65519355, "block_available": 64511508, "block_used": 1007847, "inode_total": 131071472, "inode_available": 130998698, "inode_used": 72774, "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013"}], "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
19110 1726882573.50022: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882572.7806094-20468-151177719249279/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 19110 1726882573.50046: _low_level_execute_command(): starting 19110 1726882573.50055: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882572.7806094-20468-151177719249279/ > /dev/null 2>&1 && sleep 0' 19110 1726882573.50709: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882573.50725: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882573.50740: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882573.50768: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882573.50812: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882573.50825: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882573.50839: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882573.50857: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882573.50879: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 
10.31.44.90 is address <<< 19110 1726882573.50894: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882573.50907: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882573.50922: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882573.50938: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882573.50952: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882573.50967: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882573.50987: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882573.51067: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882573.51091: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882573.51117: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882573.51242: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882573.53157: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882573.53161: stdout chunk (state=3): >>><<< 19110 1726882573.53165: stderr chunk (state=3): >>><<< 19110 1726882573.53769: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 
is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 1726882573.53773: handler run complete 19110 1726882573.53776: variable 'ansible_facts' from source: unknown 19110 1726882573.53778: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882573.53817: variable 'ansible_facts' from source: unknown 19110 1726882573.53918: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882573.54076: attempt loop complete, returning result 19110 1726882573.54087: _execute() done 19110 1726882573.54094: dumping result to json 19110 1726882573.54136: done dumping result, returning 19110 1726882573.54149: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [0e448fcc-3ce9-5372-c19a-0000000003f8] 19110 1726882573.54159: sending task result for task 0e448fcc-3ce9-5372-c19a-0000000003f8 ok: [managed_node1] 19110 1726882573.54994: no more pending results, returning what we have 19110 1726882573.54997: results queue empty 19110 1726882573.54998: checking for any_errors_fatal 19110 1726882573.54999: done checking for any_errors_fatal 19110 1726882573.55000: checking for max_fail_percentage 19110 1726882573.55002: done checking for max_fail_percentage 19110 1726882573.55003: checking to see if all hosts have failed and the running result is not ok 19110 1726882573.55004: done 
checking to see if all hosts have failed 19110 1726882573.55004: getting the remaining hosts for this loop 19110 1726882573.55006: done getting the remaining hosts for this loop 19110 1726882573.55009: getting the next task for host managed_node1 19110 1726882573.55014: done getting next task for host managed_node1 19110 1726882573.55016: ^ task is: TASK: meta (flush_handlers) 19110 1726882573.55018: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19110 1726882573.55022: getting variables 19110 1726882573.55023: in VariableManager get_vars() 19110 1726882573.55046: Calling all_inventory to load vars for managed_node1 19110 1726882573.55049: Calling groups_inventory to load vars for managed_node1 19110 1726882573.55052: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882573.55065: Calling all_plugins_play to load vars for managed_node1 19110 1726882573.55069: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882573.55072: Calling groups_plugins_play to load vars for managed_node1 19110 1726882573.56383: done sending task result for task 0e448fcc-3ce9-5372-c19a-0000000003f8 19110 1726882573.56386: WORKER PROCESS EXITING 19110 1726882573.56775: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882573.58429: done with get_vars() 19110 1726882573.58455: done getting variables 19110 1726882573.58529: in VariableManager get_vars() 19110 1726882573.58540: Calling all_inventory to load vars for managed_node1 19110 1726882573.58542: Calling groups_inventory to load vars for managed_node1 19110 1726882573.58545: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882573.58550: Calling 
all_plugins_play to load vars for managed_node1 19110 1726882573.58552: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882573.58560: Calling groups_plugins_play to load vars for managed_node1 19110 1726882573.59853: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882573.61571: done with get_vars() 19110 1726882573.61598: done queuing things up, now waiting for results queue to drain 19110 1726882573.61600: results queue empty 19110 1726882573.61601: checking for any_errors_fatal 19110 1726882573.61605: done checking for any_errors_fatal 19110 1726882573.61606: checking for max_fail_percentage 19110 1726882573.61607: done checking for max_fail_percentage 19110 1726882573.61607: checking to see if all hosts have failed and the running result is not ok 19110 1726882573.61608: done checking to see if all hosts have failed 19110 1726882573.61609: getting the remaining hosts for this loop 19110 1726882573.61610: done getting the remaining hosts for this loop 19110 1726882573.61613: getting the next task for host managed_node1 19110 1726882573.61617: done getting next task for host managed_node1 19110 1726882573.61620: ^ task is: TASK: Include the task 'delete_interface.yml' 19110 1726882573.61621: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19110 1726882573.61624: getting variables 19110 1726882573.61625: in VariableManager get_vars() 19110 1726882573.61634: Calling all_inventory to load vars for managed_node1 19110 1726882573.61636: Calling groups_inventory to load vars for managed_node1 19110 1726882573.61639: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882573.61644: Calling all_plugins_play to load vars for managed_node1 19110 1726882573.61646: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882573.61649: Calling groups_plugins_play to load vars for managed_node1 19110 1726882573.62845: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882573.64557: done with get_vars() 19110 1726882573.64582: done getting variables TASK [Include the task 'delete_interface.yml'] ********************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile+delete_interface.yml:8 Friday 20 September 2024 21:36:13 -0400 (0:00:00.956) 0:00:30.503 ****** 19110 1726882573.64661: entering _queue_task() for managed_node1/include_tasks 19110 1726882573.64996: worker is 1 (out of 1 available) 19110 1726882573.65008: exiting _queue_task() for managed_node1/include_tasks 19110 1726882573.65021: done queuing things up, now waiting for results queue to drain 19110 1726882573.65022: waiting for pending results... 
19110 1726882573.65304: running TaskExecutor() for managed_node1/TASK: Include the task 'delete_interface.yml' 19110 1726882573.65414: in run() - task 0e448fcc-3ce9-5372-c19a-000000000054 19110 1726882573.65435: variable 'ansible_search_path' from source: unknown 19110 1726882573.65483: calling self._execute() 19110 1726882573.65579: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882573.65592: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882573.65607: variable 'omit' from source: magic vars 19110 1726882573.65969: variable 'ansible_distribution_major_version' from source: facts 19110 1726882573.65987: Evaluated conditional (ansible_distribution_major_version != '6'): True 19110 1726882573.65996: _execute() done 19110 1726882573.66006: dumping result to json 19110 1726882573.66013: done dumping result, returning 19110 1726882573.66020: done running TaskExecutor() for managed_node1/TASK: Include the task 'delete_interface.yml' [0e448fcc-3ce9-5372-c19a-000000000054] 19110 1726882573.66029: sending task result for task 0e448fcc-3ce9-5372-c19a-000000000054 19110 1726882573.66151: no more pending results, returning what we have 19110 1726882573.66156: in VariableManager get_vars() 19110 1726882573.66191: Calling all_inventory to load vars for managed_node1 19110 1726882573.66194: Calling groups_inventory to load vars for managed_node1 19110 1726882573.66198: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882573.66211: Calling all_plugins_play to load vars for managed_node1 19110 1726882573.66215: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882573.66218: Calling groups_plugins_play to load vars for managed_node1 19110 1726882573.67283: done sending task result for task 0e448fcc-3ce9-5372-c19a-000000000054 19110 1726882573.67286: WORKER PROCESS EXITING 19110 1726882573.67921: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882573.69592: done with get_vars() 19110 1726882573.69618: variable 'ansible_search_path' from source: unknown 19110 1726882573.69636: we have included files to process 19110 1726882573.69637: generating all_blocks data 19110 1726882573.69638: done generating all_blocks data 19110 1726882573.69639: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 19110 1726882573.69640: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 19110 1726882573.69643: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 19110 1726882573.69882: done processing included file 19110 1726882573.69884: iterating over new_blocks loaded from include file 19110 1726882573.69886: in VariableManager get_vars() 19110 1726882573.69899: done with get_vars() 19110 1726882573.69901: filtering new block on tags 19110 1726882573.69918: done filtering new block on tags 19110 1726882573.69920: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml for managed_node1 19110 1726882573.69925: extending task lists for all hosts with included blocks 19110 1726882573.69957: done extending task lists 19110 1726882573.69958: done processing included files 19110 1726882573.69959: results queue empty 19110 1726882573.69960: checking for any_errors_fatal 19110 1726882573.69961: done checking for any_errors_fatal 19110 1726882573.69962: checking for max_fail_percentage 19110 1726882573.69964: done checking for max_fail_percentage 19110 1726882573.69965: checking to see if all hosts have failed and the running result 
is not ok 19110 1726882573.69966: done checking to see if all hosts have failed 19110 1726882573.69967: getting the remaining hosts for this loop 19110 1726882573.69968: done getting the remaining hosts for this loop 19110 1726882573.69971: getting the next task for host managed_node1 19110 1726882573.69975: done getting next task for host managed_node1 19110 1726882573.69977: ^ task is: TASK: Remove test interface if necessary 19110 1726882573.69980: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19110 1726882573.69982: getting variables 19110 1726882573.69983: in VariableManager get_vars() 19110 1726882573.69993: Calling all_inventory to load vars for managed_node1 19110 1726882573.69995: Calling groups_inventory to load vars for managed_node1 19110 1726882573.69997: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882573.70002: Calling all_plugins_play to load vars for managed_node1 19110 1726882573.70005: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882573.70008: Calling groups_plugins_play to load vars for managed_node1 19110 1726882573.71317: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882573.73014: done with get_vars() 19110 1726882573.73040: done getting variables 19110 1726882573.73090: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Remove test interface if necessary] ************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml:3 Friday 20 September 2024 21:36:13 -0400 (0:00:00.084) 0:00:30.588 ****** 19110 1726882573.73123: entering _queue_task() for managed_node1/command 19110 1726882573.73457: worker is 1 (out of 1 available) 19110 1726882573.73473: exiting _queue_task() for managed_node1/command 19110 1726882573.73485: done queuing things up, now waiting for results queue to drain 19110 1726882573.73486: waiting for pending results... 
19110 1726882573.73765: running TaskExecutor() for managed_node1/TASK: Remove test interface if necessary 19110 1726882573.73879: in run() - task 0e448fcc-3ce9-5372-c19a-000000000409 19110 1726882573.73899: variable 'ansible_search_path' from source: unknown 19110 1726882573.73906: variable 'ansible_search_path' from source: unknown 19110 1726882573.73950: calling self._execute() 19110 1726882573.74046: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882573.74057: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882573.74074: variable 'omit' from source: magic vars 19110 1726882573.74458: variable 'ansible_distribution_major_version' from source: facts 19110 1726882573.74481: Evaluated conditional (ansible_distribution_major_version != '6'): True 19110 1726882573.74493: variable 'omit' from source: magic vars 19110 1726882573.74536: variable 'omit' from source: magic vars 19110 1726882573.74636: variable 'interface' from source: set_fact 19110 1726882573.74658: variable 'omit' from source: magic vars 19110 1726882573.74708: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19110 1726882573.74745: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19110 1726882573.74770: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19110 1726882573.74794: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19110 1726882573.74813: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19110 1726882573.74842: variable 'inventory_hostname' from source: host vars for 'managed_node1' 19110 1726882573.74850: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882573.74855: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882573.74949: Set connection var ansible_timeout to 10 19110 1726882573.74969: Set connection var ansible_module_compression to ZIP_DEFLATED 19110 1726882573.74980: Set connection var ansible_shell_executable to /bin/sh 19110 1726882573.74987: Set connection var ansible_shell_type to sh 19110 1726882573.74994: Set connection var ansible_connection to ssh 19110 1726882573.75003: Set connection var ansible_pipelining to False 19110 1726882573.75032: variable 'ansible_shell_executable' from source: unknown 19110 1726882573.75040: variable 'ansible_connection' from source: unknown 19110 1726882573.75045: variable 'ansible_module_compression' from source: unknown 19110 1726882573.75050: variable 'ansible_shell_type' from source: unknown 19110 1726882573.75054: variable 'ansible_shell_executable' from source: unknown 19110 1726882573.75060: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882573.75068: variable 'ansible_pipelining' from source: unknown 19110 1726882573.75074: variable 'ansible_timeout' from source: unknown 19110 1726882573.75080: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882573.75215: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 19110 1726882573.75236: variable 'omit' from source: magic vars 19110 1726882573.75245: starting attempt loop 19110 1726882573.75252: running the handler 19110 1726882573.75270: _low_level_execute_command(): starting 19110 1726882573.75281: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 19110 1726882573.76103: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882573.76107: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882573.76134: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882573.76148: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882573.76169: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882573.76190: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882573.76204: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882573.76220: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882573.76235: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882573.76254: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882573.76277: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882573.76291: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882573.76303: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882573.76318: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882573.76406: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882573.76433: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882573.76458: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882573.76598: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882573.78273: 
stdout chunk (state=3): >>>/root <<< 19110 1726882573.78384: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882573.78473: stderr chunk (state=3): >>><<< 19110 1726882573.78486: stdout chunk (state=3): >>><<< 19110 1726882573.78577: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 1726882573.78580: _low_level_execute_command(): starting 19110 1726882573.78583: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882573.7851791-20515-8907182077726 `" && echo ansible-tmp-1726882573.7851791-20515-8907182077726="` echo /root/.ansible/tmp/ansible-tmp-1726882573.7851791-20515-8907182077726 `" ) && sleep 0' 19110 1726882573.79251: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 
19110 1726882573.79271: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882573.79286: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882573.79303: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882573.79351: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882573.79367: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882573.79382: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882573.79398: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882573.79409: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882573.79419: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882573.79430: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882573.79448: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882573.79470: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882573.79484: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882573.79496: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882573.79511: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882573.79594: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882573.79613: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882573.79627: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 
1726882573.79750: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882573.81617: stdout chunk (state=3): >>>ansible-tmp-1726882573.7851791-20515-8907182077726=/root/.ansible/tmp/ansible-tmp-1726882573.7851791-20515-8907182077726 <<< 19110 1726882573.81737: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882573.81799: stderr chunk (state=3): >>><<< 19110 1726882573.81802: stdout chunk (state=3): >>><<< 19110 1726882573.82081: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882573.7851791-20515-8907182077726=/root/.ansible/tmp/ansible-tmp-1726882573.7851791-20515-8907182077726 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 1726882573.82085: variable 'ansible_module_compression' from source: unknown 19110 1726882573.82087: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-191108pnkimox/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 19110 1726882573.82089: variable 'ansible_facts' from source: unknown 19110 1726882573.82091: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882573.7851791-20515-8907182077726/AnsiballZ_command.py 19110 1726882573.82149: Sending initial data 19110 1726882573.82152: Sent initial data (154 bytes) 19110 1726882573.83058: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882573.83076: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882573.83092: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882573.83110: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882573.83151: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882573.83166: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882573.83182: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882573.83200: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882573.83212: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882573.83224: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882573.83236: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882573.83250: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882573.83268: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882573.83282: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 
1726882573.83293: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882573.83307: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882573.83386: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882573.83408: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882573.83424: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882573.83545: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882573.85279: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 19110 1726882573.85364: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 19110 1726882573.85463: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-191108pnkimox/tmp7lhn2fd_ /root/.ansible/tmp/ansible-tmp-1726882573.7851791-20515-8907182077726/AnsiballZ_command.py <<< 19110 1726882573.85551: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 19110 1726882573.86786: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882573.87029: stderr chunk (state=3): >>><<< 19110 1726882573.87033: stdout chunk (state=3): >>><<< 19110 1726882573.87036: done transferring 
module to remote 19110 1726882573.87038: _low_level_execute_command(): starting 19110 1726882573.87043: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882573.7851791-20515-8907182077726/ /root/.ansible/tmp/ansible-tmp-1726882573.7851791-20515-8907182077726/AnsiballZ_command.py && sleep 0' 19110 1726882573.87644: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882573.87658: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882573.87688: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882573.87711: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882573.87753: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882573.87768: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882573.87783: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882573.87813: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882573.87826: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882573.87838: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882573.87851: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882573.87868: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882573.87885: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882573.87899: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882573.87917: stderr chunk (state=3): >>>debug2: match found <<< 19110 
1726882573.87934: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882573.88011: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882573.88279: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882573.88293: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882573.88397: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882573.90200: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882573.90203: stdout chunk (state=3): >>><<< 19110 1726882573.90205: stderr chunk (state=3): >>><<< 19110 1726882573.90295: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 1726882573.90298: 
_low_level_execute_command(): starting 19110 1726882573.90301: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882573.7851791-20515-8907182077726/AnsiballZ_command.py && sleep 0' 19110 1726882573.90837: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882573.90849: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882573.90862: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882573.90879: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882573.90918: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882573.90928: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882573.90939: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882573.90953: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882573.90966: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882573.90977: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882573.90986: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882573.90997: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882573.91011: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882573.91020: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882573.91029: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882573.91042: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882573.91118: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882573.91138: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882573.91151: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882573.91278: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882574.05331: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "del", "lsr27"], "start": "2024-09-20 21:36:14.041350", "end": "2024-09-20 21:36:14.051471", "delta": "0:00:00.010121", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link del lsr27", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 19110 1726882574.07101: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
<<< 19110 1726882574.07105: stdout chunk (state=3): >>><<< 19110 1726882574.07107: stderr chunk (state=3): >>><<< 19110 1726882574.07252: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "del", "lsr27"], "start": "2024-09-20 21:36:14.041350", "end": "2024-09-20 21:36:14.051471", "delta": "0:00:00.010121", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link del lsr27", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
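The trace above follows Ansible's fixed low-level execution pattern: discover the remote home (`echo ~`), create a temp dir under `~/.ansible/tmp`, transfer the AnsiballZ payload over sftp, `chmod u+x` it, run it with the remote Python, and finally `rm -f -r` the temp dir. A minimal local sketch of that shell sequence, with the ssh transport omitted and the real `ip link del lsr27` payload replaced by a placeholder (paths and the JSON fields are copied from the log; everything else here is illustrative, not Ansible's actual implementation):

```python
import json
import os
import subprocess
import tempfile

def sh(cmd):
    # Every remote step in the log is wrapped as: /bin/sh -c '<cmd> && sleep 0'
    return subprocess.run(["/bin/sh", "-c", cmd + " && sleep 0"],
                          capture_output=True, text=True)

home = sh("echo ~").stdout.strip()                    # step 1: find the remote home
base = tempfile.mkdtemp(prefix="ansible-demo-")       # stand-in for ~/.ansible/tmp
tmp = os.path.join(base, "ansible-tmp-example")       # stand-in for the timestamped dir
sh('( umask 77 && mkdir -p "{0}" && mkdir "{1}" )'.format(base, tmp))  # step 2
payload = os.path.join(tmp, "AnsiballZ_command.py")   # step 3: would be sftp'd here
with open(payload, "w") as f:
    f.write("#!/bin/sh\n")                            # placeholder, not the real wrapper
sh('chmod u+x "{0}" "{1}"'.format(tmp, payload))      # step 4: chmod, as in the log
# step 5: running the payload emits a JSON result like the one seen above:
result = json.loads('{"changed": true, "rc": 0, '
                    '"cmd": ["ip", "link", "del", "lsr27"], '
                    '"delta": "0:00:00.010121"}')
sh('rm -f -r "{0}" > /dev/null 2>&1'.format(base))    # step 6: cleanup
print(home, result["rc"], os.path.isdir(tmp))
```

Note how the cleanup mirrors the log exactly: the temp dir is removed whether or not the module succeeded, which is why a failed run leaves no `ansible-tmp-*` directories behind unless `ANSIBLE_KEEP_REMOTE_FILES` is set.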
19110 1726882574.07274: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link del lsr27', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882573.7851791-20515-8907182077726/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 19110 1726882574.07278: _low_level_execute_command(): starting 19110 1726882574.07280: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882573.7851791-20515-8907182077726/ > /dev/null 2>&1 && sleep 0' 19110 1726882574.08439: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882574.08457: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882574.08476: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882574.08494: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882574.08535: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882574.08548: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882574.08567: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882574.08585: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882574.08598: stderr chunk (state=3): >>>debug2: 
resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882574.08610: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882574.08623: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882574.08637: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882574.08654: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882574.08673: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882574.08685: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882574.08699: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882574.08776: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882574.08799: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882574.08817: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882574.08944: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882574.11508: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882574.11580: stderr chunk (state=3): >>><<< 19110 1726882574.11722: stdout chunk (state=3): >>><<< 19110 1726882574.11769: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 1726882574.11772: handler run complete 19110 1726882574.12071: Evaluated conditional (False): False 19110 1726882574.12074: attempt loop complete, returning result 19110 1726882574.12076: _execute() done 19110 1726882574.12078: dumping result to json 19110 1726882574.12080: done dumping result, returning 19110 1726882574.12082: done running TaskExecutor() for managed_node1/TASK: Remove test interface if necessary [0e448fcc-3ce9-5372-c19a-000000000409] 19110 1726882574.12084: sending task result for task 0e448fcc-3ce9-5372-c19a-000000000409 19110 1726882574.12153: done sending task result for task 0e448fcc-3ce9-5372-c19a-000000000409 19110 1726882574.12159: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "cmd": [ "ip", "link", "del", "lsr27" ], "delta": "0:00:00.010121", "end": "2024-09-20 21:36:14.051471", "rc": 0, "start": "2024-09-20 21:36:14.041350" } 19110 1726882574.12235: no more pending results, returning what we have 19110 1726882574.12238: results queue empty 19110 1726882574.12239: checking for any_errors_fatal 19110 1726882574.12241: done checking for any_errors_fatal 19110 1726882574.12241: checking for max_fail_percentage 19110 1726882574.12243: done checking for max_fail_percentage 19110 1726882574.12244: checking to see if all hosts have failed and the running result 
is not ok 19110 1726882574.12245: done checking to see if all hosts have failed 19110 1726882574.12245: getting the remaining hosts for this loop 19110 1726882574.12247: done getting the remaining hosts for this loop 19110 1726882574.12250: getting the next task for host managed_node1 19110 1726882574.12259: done getting next task for host managed_node1 19110 1726882574.12261: ^ task is: TASK: meta (flush_handlers) 19110 1726882574.12267: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19110 1726882574.12271: getting variables 19110 1726882574.12272: in VariableManager get_vars() 19110 1726882574.12300: Calling all_inventory to load vars for managed_node1 19110 1726882574.12302: Calling groups_inventory to load vars for managed_node1 19110 1726882574.12306: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882574.12316: Calling all_plugins_play to load vars for managed_node1 19110 1726882574.12319: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882574.12322: Calling groups_plugins_play to load vars for managed_node1 19110 1726882574.14024: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882574.16118: done with get_vars() 19110 1726882574.16141: done getting variables 19110 1726882574.16223: in VariableManager get_vars() 19110 1726882574.16233: Calling all_inventory to load vars for managed_node1 19110 1726882574.16236: Calling groups_inventory to load vars for managed_node1 19110 1726882574.16238: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882574.16243: Calling all_plugins_play to load vars for managed_node1 19110 1726882574.16246: Calling groups_plugins_inventory to load vars 
for managed_node1 19110 1726882574.16248: Calling groups_plugins_play to load vars for managed_node1 19110 1726882574.17715: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882574.21913: done with get_vars() 19110 1726882574.21947: done queuing things up, now waiting for results queue to drain 19110 1726882574.21949: results queue empty 19110 1726882574.21950: checking for any_errors_fatal 19110 1726882574.21953: done checking for any_errors_fatal 19110 1726882574.21954: checking for max_fail_percentage 19110 1726882574.21958: done checking for max_fail_percentage 19110 1726882574.21959: checking to see if all hosts have failed and the running result is not ok 19110 1726882574.21960: done checking to see if all hosts have failed 19110 1726882574.21960: getting the remaining hosts for this loop 19110 1726882574.21961: done getting the remaining hosts for this loop 19110 1726882574.22461: getting the next task for host managed_node1 19110 1726882574.22468: done getting next task for host managed_node1 19110 1726882574.22470: ^ task is: TASK: meta (flush_handlers) 19110 1726882574.22472: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19110 1726882574.22475: getting variables 19110 1726882574.22476: in VariableManager get_vars() 19110 1726882574.22486: Calling all_inventory to load vars for managed_node1 19110 1726882574.22488: Calling groups_inventory to load vars for managed_node1 19110 1726882574.22490: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882574.22496: Calling all_plugins_play to load vars for managed_node1 19110 1726882574.22498: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882574.22501: Calling groups_plugins_play to load vars for managed_node1 19110 1726882574.24676: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882574.26714: done with get_vars() 19110 1726882574.26741: done getting variables 19110 1726882574.26799: in VariableManager get_vars() 19110 1726882574.26815: Calling all_inventory to load vars for managed_node1 19110 1726882574.26818: Calling groups_inventory to load vars for managed_node1 19110 1726882574.26820: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882574.26825: Calling all_plugins_play to load vars for managed_node1 19110 1726882574.26828: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882574.26831: Calling groups_plugins_play to load vars for managed_node1 19110 1726882574.28189: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882574.30301: done with get_vars() 19110 1726882574.30327: done queuing things up, now waiting for results queue to drain 19110 1726882574.30330: results queue empty 19110 1726882574.30330: checking for any_errors_fatal 19110 1726882574.30332: done checking for any_errors_fatal 19110 1726882574.30333: checking for max_fail_percentage 19110 1726882574.30334: done checking for max_fail_percentage 19110 1726882574.30334: checking to see if all hosts have failed and the running result is not 
ok 19110 1726882574.30335: done checking to see if all hosts have failed 19110 1726882574.30336: getting the remaining hosts for this loop 19110 1726882574.30337: done getting the remaining hosts for this loop 19110 1726882574.30340: getting the next task for host managed_node1 19110 1726882574.30343: done getting next task for host managed_node1 19110 1726882574.30344: ^ task is: None 19110 1726882574.30345: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19110 1726882574.30346: done queuing things up, now waiting for results queue to drain 19110 1726882574.30347: results queue empty 19110 1726882574.30348: checking for any_errors_fatal 19110 1726882574.30349: done checking for any_errors_fatal 19110 1726882574.30349: checking for max_fail_percentage 19110 1726882574.30350: done checking for max_fail_percentage 19110 1726882574.30351: checking to see if all hosts have failed and the running result is not ok 19110 1726882574.30352: done checking to see if all hosts have failed 19110 1726882574.30353: getting the next task for host managed_node1 19110 1726882574.30358: done getting next task for host managed_node1 19110 1726882574.30359: ^ task is: None 19110 1726882574.30360: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19110 1726882574.30406: in VariableManager get_vars() 19110 1726882574.30428: done with get_vars() 19110 1726882574.30433: in VariableManager get_vars() 19110 1726882574.30446: done with get_vars() 19110 1726882574.30450: variable 'omit' from source: magic vars 19110 1726882574.30580: variable 'profile' from source: play vars 19110 1726882574.30685: in VariableManager get_vars() 19110 1726882574.30700: done with get_vars() 19110 1726882574.30727: variable 'omit' from source: magic vars 19110 1726882574.30794: variable 'profile' from source: play vars PLAY [Remove {{ profile }}] **************************************************** 19110 1726882574.31463: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 19110 1726882574.31491: getting the remaining hosts for this loop 19110 1726882574.31492: done getting the remaining hosts for this loop 19110 1726882574.31495: getting the next task for host managed_node1 19110 1726882574.31497: done getting next task for host managed_node1 19110 1726882574.31499: ^ task is: TASK: Gathering Facts 19110 1726882574.31500: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19110 1726882574.31502: getting variables 19110 1726882574.31503: in VariableManager get_vars() 19110 1726882574.31514: Calling all_inventory to load vars for managed_node1 19110 1726882574.31516: Calling groups_inventory to load vars for managed_node1 19110 1726882574.31518: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882574.31523: Calling all_plugins_play to load vars for managed_node1 19110 1726882574.31525: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882574.31528: Calling groups_plugins_play to load vars for managed_node1 19110 1726882574.33822: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882574.35539: done with get_vars() 19110 1726882574.35563: done getting variables 19110 1726882574.35609: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/remove_profile.yml:3 Friday 20 September 2024 21:36:14 -0400 (0:00:00.625) 0:00:31.213 ****** 19110 1726882574.35635: entering _queue_task() for managed_node1/gather_facts 19110 1726882574.35968: worker is 1 (out of 1 available) 19110 1726882574.35979: exiting _queue_task() for managed_node1/gather_facts 19110 1726882574.35990: done queuing things up, now waiting for results queue to drain 19110 1726882574.35992: waiting for pending results... 
19110 1726882574.36275: running TaskExecutor() for managed_node1/TASK: Gathering Facts 19110 1726882574.36380: in run() - task 0e448fcc-3ce9-5372-c19a-000000000417 19110 1726882574.36401: variable 'ansible_search_path' from source: unknown 19110 1726882574.36447: calling self._execute() 19110 1726882574.36545: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882574.36559: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882574.36576: variable 'omit' from source: magic vars 19110 1726882574.36948: variable 'ansible_distribution_major_version' from source: facts 19110 1726882574.36972: Evaluated conditional (ansible_distribution_major_version != '6'): True 19110 1726882574.36987: variable 'omit' from source: magic vars 19110 1726882574.37017: variable 'omit' from source: magic vars 19110 1726882574.37060: variable 'omit' from source: magic vars 19110 1726882574.37111: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19110 1726882574.37150: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19110 1726882574.37181: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19110 1726882574.37207: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19110 1726882574.37223: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19110 1726882574.37258: variable 'inventory_hostname' from source: host vars for 'managed_node1' 19110 1726882574.37270: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882574.37278: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882574.37391: Set connection var ansible_timeout to 10 19110 1726882574.37414: Set connection 
var ansible_module_compression to ZIP_DEFLATED 19110 1726882574.37425: Set connection var ansible_shell_executable to /bin/sh 19110 1726882574.37431: Set connection var ansible_shell_type to sh 19110 1726882574.37438: Set connection var ansible_connection to ssh 19110 1726882574.37447: Set connection var ansible_pipelining to False 19110 1726882574.37477: variable 'ansible_shell_executable' from source: unknown 19110 1726882574.37485: variable 'ansible_connection' from source: unknown 19110 1726882574.37493: variable 'ansible_module_compression' from source: unknown 19110 1726882574.37500: variable 'ansible_shell_type' from source: unknown 19110 1726882574.37508: variable 'ansible_shell_executable' from source: unknown 19110 1726882574.37516: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882574.37528: variable 'ansible_pipelining' from source: unknown 19110 1726882574.37534: variable 'ansible_timeout' from source: unknown 19110 1726882574.37541: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882574.37727: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 19110 1726882574.37748: variable 'omit' from source: magic vars 19110 1726882574.37761: starting attempt loop 19110 1726882574.37771: running the handler 19110 1726882574.37792: variable 'ansible_facts' from source: unknown 19110 1726882574.37815: _low_level_execute_command(): starting 19110 1726882574.37827: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 19110 1726882574.38589: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882574.38603: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 
1726882574.38621: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882574.38638: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882574.38686: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882574.38697: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882574.38709: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882574.38728: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882574.38738: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882574.38747: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882574.38759: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882574.38776: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882574.38793: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882574.38805: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882574.38816: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882574.38832: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882574.38911: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882574.38934: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882574.38953: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882574.39091: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 
1726882574.40770: stdout chunk (state=3): >>>/root <<< 19110 1726882574.40950: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882574.40953: stdout chunk (state=3): >>><<< 19110 1726882574.40958: stderr chunk (state=3): >>><<< 19110 1726882574.41067: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 1726882574.41072: _low_level_execute_command(): starting 19110 1726882574.41075: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882574.4097815-20546-21674809863917 `" && echo ansible-tmp-1726882574.4097815-20546-21674809863917="` echo /root/.ansible/tmp/ansible-tmp-1726882574.4097815-20546-21674809863917 `" ) && sleep 0' 19110 1726882574.42187: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 
3.2.2 4 Jun 2024 <<< 19110 1726882574.42201: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882574.42216: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882574.42239: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882574.42285: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882574.42298: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882574.42312: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882574.42331: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882574.42346: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882574.42357: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882574.42373: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882574.42387: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882574.42403: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882574.42414: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882574.42425: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882574.42439: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882574.42516: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882574.42540: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882574.42559: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 
<<< 19110 1726882574.42694: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882574.44589: stdout chunk (state=3): >>>ansible-tmp-1726882574.4097815-20546-21674809863917=/root/.ansible/tmp/ansible-tmp-1726882574.4097815-20546-21674809863917 <<< 19110 1726882574.44770: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882574.44773: stdout chunk (state=3): >>><<< 19110 1726882574.44787: stderr chunk (state=3): >>><<< 19110 1726882574.44972: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882574.4097815-20546-21674809863917=/root/.ansible/tmp/ansible-tmp-1726882574.4097815-20546-21674809863917 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 1726882574.44975: variable 'ansible_module_compression' from source: unknown 19110 1726882574.44978: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-191108pnkimox/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 19110 1726882574.44980: variable 'ansible_facts' from source: unknown 19110 1726882574.45140: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882574.4097815-20546-21674809863917/AnsiballZ_setup.py 19110 1726882574.45810: Sending initial data 19110 1726882574.45813: Sent initial data (153 bytes) 19110 1726882574.47504: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882574.47508: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882574.47544: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882574.47548: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882574.47551: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882574.47624: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882574.47687: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882574.47897: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882574.49677: stderr chunk (state=3): >>>debug2: 
Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 <<< 19110 1726882574.49681: stderr chunk (state=3): >>>debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 19110 1726882574.49761: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 19110 1726882574.49867: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-191108pnkimox/tmpd3ph1tr6 /root/.ansible/tmp/ansible-tmp-1726882574.4097815-20546-21674809863917/AnsiballZ_setup.py <<< 19110 1726882574.49955: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 19110 1726882574.53290: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882574.53960: stderr chunk (state=3): >>><<< 19110 1726882574.53967: stdout chunk (state=3): >>><<< 19110 1726882574.53970: done transferring module to remote 19110 1726882574.53972: _low_level_execute_command(): starting 19110 1726882574.53975: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882574.4097815-20546-21674809863917/ /root/.ansible/tmp/ansible-tmp-1726882574.4097815-20546-21674809863917/AnsiballZ_setup.py && sleep 0' 19110 1726882574.54580: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882574.54584: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882574.54622: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found <<< 19110 1726882574.54625: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882574.54629: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882574.54695: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882574.54698: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882574.54806: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882574.56599: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882574.56659: stderr chunk (state=3): >>><<< 19110 1726882574.56666: stdout chunk (state=3): >>><<< 19110 1726882574.56768: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 1726882574.56771: _low_level_execute_command(): starting 19110 1726882574.56774: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882574.4097815-20546-21674809863917/AnsiballZ_setup.py && sleep 0' 19110 1726882574.57997: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882574.58000: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882574.58036: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found <<< 19110 1726882574.58039: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882574.58042: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 
originally 10.31.44.90 debug2: match found <<< 19110 1726882574.58044: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882574.58306: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882574.58310: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882574.58424: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882575.09473: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_system": "Linux", "ansible_kernel": "5.14.0-508.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 15:49:37 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "ip-10-31-44-90.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-44-90", "ansible_nodename": "ip-10-31-44-90.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "465293f2bd9b457497a5eaf565f184f8", "ansible_env": {"SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.40.7 34614 10.31.44.90 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.40.7 34614 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": 
"/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_is_chroot": false, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBANd7RrvRqB/kTBmp4g3vOXSd1XQX7zSKmnuTmpsZ60vmB0PwXJIj7HTI9DBqGrzofXOOHlh/Qs4ml+D2H79EO+6EUEZza9meZ+anJVVwXBuxgfn1Hl/EExsmp7gV5o+r0rCFeuFds6ISUZQpal2OlQk3lrit00RA//JoxvQO1YQdAAAAFQDxLJGSeERY5sZYyGr88om1BNq9KQAAAIEA1WO5ElHMof6Upf9GQZn4wlJh6JUOjZfYnL4XATh/W6uye7kuC/rBqGPirkmks1GCUkKhSkzQlRIPyLYENrbPKRMNarYAwwQ8N8BDOUWFDCcrO55SJdlLTyYGWdlVFysYGSMIyZT5ye4oL3Nkff/e1ZGqjvL0sLNJaye4za70Xj4AAACAXRnpJPDKgCzakgHO14HsH3r7qS4cvfQld+ThBJZcbtKtJUyUo1sqg9NbvK+hvA41TYWOVy52DcWn5yzPwfhAn+mQEcAdBXWqSSJdESZ9fPnbc9C1XO02sYpyX5+wsZnau23XhtnlnY8jRTpWgRt4X8AWaSND9mfeL8ZDTgVphRc=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQCi3knQjBMOMykE1WC1CWkOsV5q0hmL0iOCSTq050rqbxjUmkXoe4BeOWZh4ID8k5GgD5emRh5PU5ME25rsP6hnelUBzDOCjdgI1rmTkUbw5oFRN+kpX2RbAM/2L5J6RrqqllFBjrHtSG6JINsPIWqLn9rlmE965Kj4AY3QplgiApeN07yDM5YPS2tpRpVHVIxZia7pdzKs+h+TXJoo/Z7SMAw8MTUNeIXd9bSzuxhulOrLNrvrcej4EVg88FYiy96oA+NBNLFb41RHNLumM3xUQvjCeyqP1hcUw9YAY+gwADhRGBKQ+JCAzzLqyM/3RAO8gXXJr1Yjr+H9xi8IwKB71QU/lw7bWc33YuNbe5mDlUHQ/a2qvo4O96wD8m4eZpu81iHiwtIU5cwKm+fk8sz9kxOR77AozaYLtjgW9FYUmxh2ChfTBV2rnFflhC3CjgRMlZv8CLtne5JcsRFSpHeCB2RXzA1JPiF89OxoUFa8NsPqdUyAMUkaR8MmW+fj+t8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAqf0PBdBRTqmYqGYFABEC2soCa/Gsi3A2munUEZEo0enWjwRQivB5pKJDjOn6lwgGyr2ebU0/VpRzddk73uEfk=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOdij61fBkKDElS700Z560nYW2c4QCIx/VplUW7jn+UE", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_loadavg": {"1m": 0.55, "5m": 0.42, "15m": 0.23}, "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_fips": false, "ansible_iscsi_iqn": "", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_fibre_channel_wwn": [], "ansible_local": {}, "ansible_apparmor": {"status": "disabled"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": 
"UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2796, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 736, "free": 2796}, "nocache": {"free": 3258, "used": 274}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2ca037-5a7b-1ebc-28c4-7fb32adce23a", "ansible_product_uuid": "ec2ca037-5a7b-1ebc-28c4-7fb32adce23a", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", 
"sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 732, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264239157248, "block_size": 4096, "block_total": 65519355, "block_available": 64511513, "block_used": 1007842, "inode_total": 131071472, "inode_available": 130998698, "inode_used": 72774, "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013"}], "ansible_lsb": {}, "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "02:9e:a1:0b:f9:6d", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.44.90", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::9e:a1ff:fe0b:f96d", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": 
"off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off 
[fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", 
"hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.44.90", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:9e:a1:0b:f9:6d", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.44.90"], "ansible_all_ipv6_addresses": ["fe80::9e:a1ff:fe0b:f96d"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.44.90", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::9e:a1ff:fe0b:f96d"]}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:ef4e1c39-6f50-438a-87e7-12fb70b80bde", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "36", "second": "15", "epoch": "1726882575", "epoch_int": "1726882575", "date": "2024-09-20", "time": "21:36:15", "iso8601_micro": "2024-09-21T01:36:15.090570Z", "iso8601": "2024-09-21T01:36:15Z", "iso8601_basic": "20240920T213615090570", "iso8601_basic_short": "20240920T213615", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 19110 1726882575.11144: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
<<< 19110 1726882575.11222: stderr chunk (state=3): >>><<< 19110 1726882575.11226: stdout chunk (state=3): >>><<< 19110 1726882575.11704: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_system": "Linux", "ansible_kernel": "5.14.0-508.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 15:49:37 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "ip-10-31-44-90.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-44-90", "ansible_nodename": "ip-10-31-44-90.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "465293f2bd9b457497a5eaf565f184f8", "ansible_env": {"SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.40.7 34614 10.31.44.90 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.40.7 34614 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", 
"ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_is_chroot": false, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBANd7RrvRqB/kTBmp4g3vOXSd1XQX7zSKmnuTmpsZ60vmB0PwXJIj7HTI9DBqGrzofXOOHlh/Qs4ml+D2H79EO+6EUEZza9meZ+anJVVwXBuxgfn1Hl/EExsmp7gV5o+r0rCFeuFds6ISUZQpal2OlQk3lrit00RA//JoxvQO1YQdAAAAFQDxLJGSeERY5sZYyGr88om1BNq9KQAAAIEA1WO5ElHMof6Upf9GQZn4wlJh6JUOjZfYnL4XATh/W6uye7kuC/rBqGPirkmks1GCUkKhSkzQlRIPyLYENrbPKRMNarYAwwQ8N8BDOUWFDCcrO55SJdlLTyYGWdlVFysYGSMIyZT5ye4oL3Nkff/e1ZGqjvL0sLNJaye4za70Xj4AAACAXRnpJPDKgCzakgHO14HsH3r7qS4cvfQld+ThBJZcbtKtJUyUo1sqg9NbvK+hvA41TYWOVy52DcWn5yzPwfhAn+mQEcAdBXWqSSJdESZ9fPnbc9C1XO02sYpyX5+wsZnau23XhtnlnY8jRTpWgRt4X8AWaSND9mfeL8ZDTgVphRc=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCi3knQjBMOMykE1WC1CWkOsV5q0hmL0iOCSTq050rqbxjUmkXoe4BeOWZh4ID8k5GgD5emRh5PU5ME25rsP6hnelUBzDOCjdgI1rmTkUbw5oFRN+kpX2RbAM/2L5J6RrqqllFBjrHtSG6JINsPIWqLn9rlmE965Kj4AY3QplgiApeN07yDM5YPS2tpRpVHVIxZia7pdzKs+h+TXJoo/Z7SMAw8MTUNeIXd9bSzuxhulOrLNrvrcej4EVg88FYiy96oA+NBNLFb41RHNLumM3xUQvjCeyqP1hcUw9YAY+gwADhRGBKQ+JCAzzLqyM/3RAO8gXXJr1Yjr+H9xi8IwKB71QU/lw7bWc33YuNbe5mDlUHQ/a2qvo4O96wD8m4eZpu81iHiwtIU5cwKm+fk8sz9kxOR77AozaYLtjgW9FYUmxh2ChfTBV2rnFflhC3CjgRMlZv8CLtne5JcsRFSpHeCB2RXzA1JPiF89OxoUFa8NsPqdUyAMUkaR8MmW+fj+t8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", 
"ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAqf0PBdBRTqmYqGYFABEC2soCa/Gsi3A2munUEZEo0enWjwRQivB5pKJDjOn6lwgGyr2ebU0/VpRzddk73uEfk=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOdij61fBkKDElS700Z560nYW2c4QCIx/VplUW7jn+UE", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_loadavg": {"1m": 0.55, "5m": 0.42, "15m": 0.23}, "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_fips": false, "ansible_iscsi_iqn": "", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_fibre_channel_wwn": [], "ansible_local": {}, "ansible_apparmor": {"status": "disabled"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2796, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, 
"ansible_memory_mb": {"real": {"total": 3532, "used": 736, "free": 2796}, "nocache": {"free": 3258, "used": 274}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2ca037-5a7b-1ebc-28c4-7fb32adce23a", "ansible_product_uuid": "ec2ca037-5a7b-1ebc-28c4-7fb32adce23a", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 732, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264239157248, "block_size": 4096, "block_total": 65519355, "block_available": 64511513, "block_used": 1007842, 
"inode_total": 131071472, "inode_available": 130998698, "inode_used": 72774, "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013"}], "ansible_lsb": {}, "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "02:9e:a1:0b:f9:6d", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.44.90", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::9e:a1ff:fe0b:f96d", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": 
"off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", 
"vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.44.90", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:9e:a1:0b:f9:6d", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.44.90"], "ansible_all_ipv6_addresses": ["fe80::9e:a1ff:fe0b:f96d"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.44.90", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", 
"fe80::9e:a1ff:fe0b:f96d"]}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:ef4e1c39-6f50-438a-87e7-12fb70b80bde", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "36", "second": "15", "epoch": "1726882575", "epoch_int": "1726882575", "date": "2024-09-20", "time": "21:36:15", "iso8601_micro": "2024-09-21T01:36:15.090570Z", "iso8601": "2024-09-21T01:36:15Z", "iso8601_basic": "20240920T213615090570", "iso8601_basic_short": "20240920T213615", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master 
version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 19110 1726882575.11714: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882574.4097815-20546-21674809863917/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 19110 1726882575.11718: _low_level_execute_command(): starting 19110 1726882575.11720: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882574.4097815-20546-21674809863917/ > /dev/null 2>&1 && sleep 0' 19110 1726882575.12301: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882575.12316: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882575.12331: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882575.12348: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882575.12390: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882575.12402: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882575.12415: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882575.12432: stderr chunk (state=3): 
>>>debug1: configuration requests final Match pass <<< 19110 1726882575.12443: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882575.12453: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882575.12468: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882575.12483: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882575.12499: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882575.12511: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882575.12521: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882575.12533: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882575.12611: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882575.12632: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882575.12647: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882575.12774: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882575.14691: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882575.14694: stdout chunk (state=3): >>><<< 19110 1726882575.14697: stderr chunk (state=3): >>><<< 19110 1726882575.14968: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 1726882575.14972: handler run complete 19110 1726882575.14974: variable 'ansible_facts' from source: unknown 19110 1726882575.14976: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882575.15274: variable 'ansible_facts' from source: unknown 19110 1726882575.15372: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882575.15532: attempt loop complete, returning result 19110 1726882575.15540: _execute() done 19110 1726882575.15545: dumping result to json 19110 1726882575.15574: done dumping result, returning 19110 1726882575.15585: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [0e448fcc-3ce9-5372-c19a-000000000417] 19110 1726882575.15592: sending task result for task 0e448fcc-3ce9-5372-c19a-000000000417 ok: [managed_node1] 19110 1726882575.16219: no more pending results, returning what we have 19110 1726882575.16222: results queue empty 19110 1726882575.16224: checking for any_errors_fatal 19110 1726882575.16225: done checking for any_errors_fatal 19110 1726882575.16226: checking for max_fail_percentage 19110 1726882575.16228: done checking for 
max_fail_percentage 19110 1726882575.16229: checking to see if all hosts have failed and the running result is not ok 19110 1726882575.16229: done checking to see if all hosts have failed 19110 1726882575.16230: getting the remaining hosts for this loop 19110 1726882575.16232: done getting the remaining hosts for this loop 19110 1726882575.16235: getting the next task for host managed_node1 19110 1726882575.16242: done getting next task for host managed_node1 19110 1726882575.16244: ^ task is: TASK: meta (flush_handlers) 19110 1726882575.16246: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19110 1726882575.16250: getting variables 19110 1726882575.16251: in VariableManager get_vars() 19110 1726882575.16293: Calling all_inventory to load vars for managed_node1 19110 1726882575.16296: Calling groups_inventory to load vars for managed_node1 19110 1726882575.16298: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882575.16310: Calling all_plugins_play to load vars for managed_node1 19110 1726882575.16313: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882575.16317: Calling groups_plugins_play to load vars for managed_node1 19110 1726882575.17171: done sending task result for task 0e448fcc-3ce9-5372-c19a-000000000417 19110 1726882575.17175: WORKER PROCESS EXITING 19110 1726882575.18420: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882575.20168: done with get_vars() 19110 1726882575.20191: done getting variables 19110 1726882575.20253: in VariableManager get_vars() 19110 1726882575.20270: Calling all_inventory to load vars for managed_node1 19110 1726882575.20272: Calling groups_inventory to load vars 
for managed_node1 19110 1726882575.20274: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882575.20279: Calling all_plugins_play to load vars for managed_node1 19110 1726882575.20281: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882575.20287: Calling groups_plugins_play to load vars for managed_node1 19110 1726882575.21520: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882575.28421: done with get_vars() 19110 1726882575.28453: done queuing things up, now waiting for results queue to drain 19110 1726882575.28458: results queue empty 19110 1726882575.28459: checking for any_errors_fatal 19110 1726882575.28466: done checking for any_errors_fatal 19110 1726882575.28467: checking for max_fail_percentage 19110 1726882575.28468: done checking for max_fail_percentage 19110 1726882575.28469: checking to see if all hosts have failed and the running result is not ok 19110 1726882575.28470: done checking to see if all hosts have failed 19110 1726882575.28470: getting the remaining hosts for this loop 19110 1726882575.28471: done getting the remaining hosts for this loop 19110 1726882575.28474: getting the next task for host managed_node1 19110 1726882575.28478: done getting next task for host managed_node1 19110 1726882575.28481: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 19110 1726882575.28483: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19110 1726882575.28492: getting variables 19110 1726882575.28494: in VariableManager get_vars() 19110 1726882575.28508: Calling all_inventory to load vars for managed_node1 19110 1726882575.28510: Calling groups_inventory to load vars for managed_node1 19110 1726882575.28512: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882575.28517: Calling all_plugins_play to load vars for managed_node1 19110 1726882575.28519: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882575.28522: Calling groups_plugins_play to load vars for managed_node1 19110 1726882575.30559: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882575.32523: done with get_vars() 19110 1726882575.32548: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 21:36:15 -0400 (0:00:00.969) 0:00:32.183 ****** 19110 1726882575.32623: entering _queue_task() for managed_node1/include_tasks 19110 1726882575.33012: worker is 1 (out of 1 available) 19110 1726882575.33024: exiting _queue_task() for managed_node1/include_tasks 19110 1726882575.33034: done queuing things up, now waiting for results queue to drain 19110 1726882575.33035: waiting for pending results... 
19110 1726882575.33438: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 19110 1726882575.33577: in run() - task 0e448fcc-3ce9-5372-c19a-00000000005c 19110 1726882575.33603: variable 'ansible_search_path' from source: unknown 19110 1726882575.33610: variable 'ansible_search_path' from source: unknown 19110 1726882575.33653: calling self._execute() 19110 1726882575.33762: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882575.33777: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882575.33795: variable 'omit' from source: magic vars 19110 1726882575.34194: variable 'ansible_distribution_major_version' from source: facts 19110 1726882575.34212: Evaluated conditional (ansible_distribution_major_version != '6'): True 19110 1726882575.34224: _execute() done 19110 1726882575.34234: dumping result to json 19110 1726882575.34245: done dumping result, returning 19110 1726882575.34258: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0e448fcc-3ce9-5372-c19a-00000000005c] 19110 1726882575.34273: sending task result for task 0e448fcc-3ce9-5372-c19a-00000000005c 19110 1726882575.34418: no more pending results, returning what we have 19110 1726882575.34424: in VariableManager get_vars() 19110 1726882575.34475: Calling all_inventory to load vars for managed_node1 19110 1726882575.34479: Calling groups_inventory to load vars for managed_node1 19110 1726882575.34482: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882575.34496: Calling all_plugins_play to load vars for managed_node1 19110 1726882575.34499: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882575.34502: Calling groups_plugins_play to load vars for managed_node1 19110 1726882575.35583: done sending task result for task 0e448fcc-3ce9-5372-c19a-00000000005c 19110 
1726882575.35586: WORKER PROCESS EXITING 19110 1726882575.36351: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882575.38109: done with get_vars() 19110 1726882575.38128: variable 'ansible_search_path' from source: unknown 19110 1726882575.38129: variable 'ansible_search_path' from source: unknown 19110 1726882575.38158: we have included files to process 19110 1726882575.38159: generating all_blocks data 19110 1726882575.38161: done generating all_blocks data 19110 1726882575.38161: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 19110 1726882575.38162: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 19110 1726882575.38166: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 19110 1726882575.38716: done processing included file 19110 1726882575.38718: iterating over new_blocks loaded from include file 19110 1726882575.38719: in VariableManager get_vars() 19110 1726882575.38738: done with get_vars() 19110 1726882575.38739: filtering new block on tags 19110 1726882575.38752: done filtering new block on tags 19110 1726882575.38757: in VariableManager get_vars() 19110 1726882575.38775: done with get_vars() 19110 1726882575.38776: filtering new block on tags 19110 1726882575.38792: done filtering new block on tags 19110 1726882575.38794: in VariableManager get_vars() 19110 1726882575.38810: done with get_vars() 19110 1726882575.38811: filtering new block on tags 19110 1726882575.38826: done filtering new block on tags 19110 1726882575.38828: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node1 19110 1726882575.38833: extending task lists for all hosts 
with included blocks 19110 1726882575.39229: done extending task lists 19110 1726882575.39230: done processing included files 19110 1726882575.39231: results queue empty 19110 1726882575.39232: checking for any_errors_fatal 19110 1726882575.39233: done checking for any_errors_fatal 19110 1726882575.39234: checking for max_fail_percentage 19110 1726882575.39235: done checking for max_fail_percentage 19110 1726882575.39236: checking to see if all hosts have failed and the running result is not ok 19110 1726882575.39237: done checking to see if all hosts have failed 19110 1726882575.39238: getting the remaining hosts for this loop 19110 1726882575.39239: done getting the remaining hosts for this loop 19110 1726882575.39241: getting the next task for host managed_node1 19110 1726882575.39244: done getting next task for host managed_node1 19110 1726882575.39247: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 19110 1726882575.39249: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19110 1726882575.39261: getting variables 19110 1726882575.39262: in VariableManager get_vars() 19110 1726882575.39277: Calling all_inventory to load vars for managed_node1 19110 1726882575.39280: Calling groups_inventory to load vars for managed_node1 19110 1726882575.39282: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882575.39287: Calling all_plugins_play to load vars for managed_node1 19110 1726882575.39290: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882575.39293: Calling groups_plugins_play to load vars for managed_node1 19110 1726882575.40554: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882575.42624: done with get_vars() 19110 1726882575.42648: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 21:36:15 -0400 (0:00:00.100) 0:00:32.284 ****** 19110 1726882575.42725: entering _queue_task() for managed_node1/setup 19110 1726882575.43060: worker is 1 (out of 1 available) 19110 1726882575.43075: exiting _queue_task() for managed_node1/setup 19110 1726882575.43086: done queuing things up, now waiting for results queue to drain 19110 1726882575.43088: waiting for pending results... 
19110 1726882575.43390: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 19110 1726882575.43535: in run() - task 0e448fcc-3ce9-5372-c19a-000000000458 19110 1726882575.43552: variable 'ansible_search_path' from source: unknown 19110 1726882575.43561: variable 'ansible_search_path' from source: unknown 19110 1726882575.43599: calling self._execute() 19110 1726882575.43692: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882575.43702: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882575.43713: variable 'omit' from source: magic vars 19110 1726882575.44082: variable 'ansible_distribution_major_version' from source: facts 19110 1726882575.44098: Evaluated conditional (ansible_distribution_major_version != '6'): True 19110 1726882575.44308: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 19110 1726882575.46747: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 19110 1726882575.46836: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 19110 1726882575.46884: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 19110 1726882575.46930: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 19110 1726882575.46965: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 19110 1726882575.47061: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19110 1726882575.47100: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19110 1726882575.47137: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19110 1726882575.47189: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19110 1726882575.47209: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19110 1726882575.47273: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19110 1726882575.47302: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19110 1726882575.47337: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19110 1726882575.47389: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19110 1726882575.47408: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19110 1726882575.47586: variable '__network_required_facts' from source: role 
'' defaults 19110 1726882575.47601: variable 'ansible_facts' from source: unknown 19110 1726882575.48418: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 19110 1726882575.48429: when evaluation is False, skipping this task 19110 1726882575.48437: _execute() done 19110 1726882575.48445: dumping result to json 19110 1726882575.48453: done dumping result, returning 19110 1726882575.48471: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0e448fcc-3ce9-5372-c19a-000000000458] 19110 1726882575.48483: sending task result for task 0e448fcc-3ce9-5372-c19a-000000000458 skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 19110 1726882575.48631: no more pending results, returning what we have 19110 1726882575.48636: results queue empty 19110 1726882575.48637: checking for any_errors_fatal 19110 1726882575.48638: done checking for any_errors_fatal 19110 1726882575.48639: checking for max_fail_percentage 19110 1726882575.48641: done checking for max_fail_percentage 19110 1726882575.48642: checking to see if all hosts have failed and the running result is not ok 19110 1726882575.48643: done checking to see if all hosts have failed 19110 1726882575.48644: getting the remaining hosts for this loop 19110 1726882575.48645: done getting the remaining hosts for this loop 19110 1726882575.48649: getting the next task for host managed_node1 19110 1726882575.48662: done getting next task for host managed_node1 19110 1726882575.48670: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 19110 1726882575.48674: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19110 1726882575.48687: getting variables 19110 1726882575.48689: in VariableManager get_vars() 19110 1726882575.48730: Calling all_inventory to load vars for managed_node1 19110 1726882575.48734: Calling groups_inventory to load vars for managed_node1 19110 1726882575.48737: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882575.48748: Calling all_plugins_play to load vars for managed_node1 19110 1726882575.48751: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882575.48754: Calling groups_plugins_play to load vars for managed_node1 19110 1726882575.49781: done sending task result for task 0e448fcc-3ce9-5372-c19a-000000000458 19110 1726882575.49785: WORKER PROCESS EXITING 19110 1726882575.50565: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882575.52302: done with get_vars() 19110 1726882575.52328: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 21:36:15 -0400 (0:00:00.097) 0:00:32.381 ****** 19110 1726882575.52433: entering _queue_task() for managed_node1/stat 19110 1726882575.52745: worker is 1 (out of 1 available) 19110 1726882575.52761: exiting _queue_task() for managed_node1/stat 19110 1726882575.52774: done queuing things up, now waiting for results queue to drain 19110 1726882575.52775: waiting for pending results... 
19110 1726882575.53061: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if system is ostree 19110 1726882575.53213: in run() - task 0e448fcc-3ce9-5372-c19a-00000000045a 19110 1726882575.53235: variable 'ansible_search_path' from source: unknown 19110 1726882575.53243: variable 'ansible_search_path' from source: unknown 19110 1726882575.53287: calling self._execute() 19110 1726882575.53389: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882575.53400: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882575.53414: variable 'omit' from source: magic vars 19110 1726882575.53806: variable 'ansible_distribution_major_version' from source: facts 19110 1726882575.53826: Evaluated conditional (ansible_distribution_major_version != '6'): True 19110 1726882575.54008: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 19110 1726882575.54289: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 19110 1726882575.54340: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 19110 1726882575.54384: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 19110 1726882575.54428: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 19110 1726882575.54540: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 19110 1726882575.54573: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 19110 1726882575.54601: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 19110 1726882575.54631: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 19110 1726882575.54724: variable '__network_is_ostree' from source: set_fact 19110 1726882575.54737: Evaluated conditional (not __network_is_ostree is defined): False 19110 1726882575.54745: when evaluation is False, skipping this task 19110 1726882575.54751: _execute() done 19110 1726882575.54759: dumping result to json 19110 1726882575.54768: done dumping result, returning 19110 1726882575.54778: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if system is ostree [0e448fcc-3ce9-5372-c19a-00000000045a] 19110 1726882575.54787: sending task result for task 0e448fcc-3ce9-5372-c19a-00000000045a skipping: [managed_node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 19110 1726882575.54921: no more pending results, returning what we have 19110 1726882575.54926: results queue empty 19110 1726882575.54927: checking for any_errors_fatal 19110 1726882575.54933: done checking for any_errors_fatal 19110 1726882575.54934: checking for max_fail_percentage 19110 1726882575.54936: done checking for max_fail_percentage 19110 1726882575.54937: checking to see if all hosts have failed and the running result is not ok 19110 1726882575.54937: done checking to see if all hosts have failed 19110 1726882575.54938: getting the remaining hosts for this loop 19110 1726882575.54940: done getting the remaining hosts for this loop 19110 1726882575.54944: getting the next task for host managed_node1 19110 1726882575.54951: done getting next task for host managed_node1 19110 
1726882575.54954: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 19110 1726882575.54960: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19110 1726882575.54975: getting variables 19110 1726882575.54977: in VariableManager get_vars() 19110 1726882575.55015: Calling all_inventory to load vars for managed_node1 19110 1726882575.55017: Calling groups_inventory to load vars for managed_node1 19110 1726882575.55020: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882575.55030: Calling all_plugins_play to load vars for managed_node1 19110 1726882575.55032: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882575.55035: Calling groups_plugins_play to load vars for managed_node1 19110 1726882575.56082: done sending task result for task 0e448fcc-3ce9-5372-c19a-00000000045a 19110 1726882575.56085: WORKER PROCESS EXITING 19110 1726882575.56869: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882575.58592: done with get_vars() 19110 1726882575.58618: done getting variables 19110 1726882575.58685: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 
TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 21:36:15 -0400 (0:00:00.062) 0:00:32.444 ****** 19110 1726882575.58719: entering _queue_task() for managed_node1/set_fact 19110 1726882575.59029: worker is 1 (out of 1 available) 19110 1726882575.59040: exiting _queue_task() for managed_node1/set_fact 19110 1726882575.59051: done queuing things up, now waiting for results queue to drain 19110 1726882575.59052: waiting for pending results... 19110 1726882575.59334: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 19110 1726882575.59476: in run() - task 0e448fcc-3ce9-5372-c19a-00000000045b 19110 1726882575.59500: variable 'ansible_search_path' from source: unknown 19110 1726882575.59508: variable 'ansible_search_path' from source: unknown 19110 1726882575.59550: calling self._execute() 19110 1726882575.59650: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882575.59667: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882575.59682: variable 'omit' from source: magic vars 19110 1726882575.60068: variable 'ansible_distribution_major_version' from source: facts 19110 1726882575.60085: Evaluated conditional (ansible_distribution_major_version != '6'): True 19110 1726882575.60239: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 19110 1726882575.60506: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 19110 1726882575.60553: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 19110 1726882575.60600: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 19110 
1726882575.60636: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 19110 1726882575.60743: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 19110 1726882575.60774: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 19110 1726882575.60806: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 19110 1726882575.60837: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 19110 1726882575.60932: variable '__network_is_ostree' from source: set_fact 19110 1726882575.60944: Evaluated conditional (not __network_is_ostree is defined): False 19110 1726882575.60951: when evaluation is False, skipping this task 19110 1726882575.60962: _execute() done 19110 1726882575.60972: dumping result to json 19110 1726882575.60979: done dumping result, returning 19110 1726882575.60989: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0e448fcc-3ce9-5372-c19a-00000000045b] 19110 1726882575.61000: sending task result for task 0e448fcc-3ce9-5372-c19a-00000000045b 19110 1726882575.61106: done sending task result for task 0e448fcc-3ce9-5372-c19a-00000000045b 19110 1726882575.61115: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 19110 1726882575.61169: no more pending results, returning what we 
have 19110 1726882575.61173: results queue empty 19110 1726882575.61175: checking for any_errors_fatal 19110 1726882575.61180: done checking for any_errors_fatal 19110 1726882575.61181: checking for max_fail_percentage 19110 1726882575.61183: done checking for max_fail_percentage 19110 1726882575.61184: checking to see if all hosts have failed and the running result is not ok 19110 1726882575.61185: done checking to see if all hosts have failed 19110 1726882575.61186: getting the remaining hosts for this loop 19110 1726882575.61187: done getting the remaining hosts for this loop 19110 1726882575.61191: getting the next task for host managed_node1 19110 1726882575.61200: done getting next task for host managed_node1 19110 1726882575.61204: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 19110 1726882575.61207: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19110 1726882575.61221: getting variables 19110 1726882575.61223: in VariableManager get_vars() 19110 1726882575.61266: Calling all_inventory to load vars for managed_node1 19110 1726882575.61269: Calling groups_inventory to load vars for managed_node1 19110 1726882575.61272: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882575.61283: Calling all_plugins_play to load vars for managed_node1 19110 1726882575.61286: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882575.61290: Calling groups_plugins_play to load vars for managed_node1 19110 1726882575.63016: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882575.64770: done with get_vars() 19110 1726882575.64796: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 21:36:15 -0400 (0:00:00.061) 0:00:32.506 ****** 19110 1726882575.64896: entering _queue_task() for managed_node1/service_facts 19110 1726882575.65185: worker is 1 (out of 1 available) 19110 1726882575.65197: exiting _queue_task() for managed_node1/service_facts 19110 1726882575.65208: done queuing things up, now waiting for results queue to drain 19110 1726882575.65210: waiting for pending results... 
19110 1726882575.65490: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which services are running 19110 1726882575.65616: in run() - task 0e448fcc-3ce9-5372-c19a-00000000045d 19110 1726882575.65633: variable 'ansible_search_path' from source: unknown 19110 1726882575.65639: variable 'ansible_search_path' from source: unknown 19110 1726882575.65684: calling self._execute() 19110 1726882575.65776: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882575.65788: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882575.65800: variable 'omit' from source: magic vars 19110 1726882575.66166: variable 'ansible_distribution_major_version' from source: facts 19110 1726882575.66182: Evaluated conditional (ansible_distribution_major_version != '6'): True 19110 1726882575.66192: variable 'omit' from source: magic vars 19110 1726882575.66259: variable 'omit' from source: magic vars 19110 1726882575.66304: variable 'omit' from source: magic vars 19110 1726882575.66350: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19110 1726882575.66393: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19110 1726882575.66422: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19110 1726882575.66445: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19110 1726882575.66465: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19110 1726882575.66498: variable 'inventory_hostname' from source: host vars for 'managed_node1' 19110 1726882575.66508: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882575.66516: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node1' 19110 1726882575.66625: Set connection var ansible_timeout to 10 19110 1726882575.66647: Set connection var ansible_module_compression to ZIP_DEFLATED 19110 1726882575.66661: Set connection var ansible_shell_executable to /bin/sh 19110 1726882575.66673: Set connection var ansible_shell_type to sh 19110 1726882575.66680: Set connection var ansible_connection to ssh 19110 1726882575.66690: Set connection var ansible_pipelining to False 19110 1726882575.66715: variable 'ansible_shell_executable' from source: unknown 19110 1726882575.66724: variable 'ansible_connection' from source: unknown 19110 1726882575.66732: variable 'ansible_module_compression' from source: unknown 19110 1726882575.66743: variable 'ansible_shell_type' from source: unknown 19110 1726882575.66750: variable 'ansible_shell_executable' from source: unknown 19110 1726882575.66760: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882575.66772: variable 'ansible_pipelining' from source: unknown 19110 1726882575.66779: variable 'ansible_timeout' from source: unknown 19110 1726882575.66787: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882575.66989: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 19110 1726882575.67004: variable 'omit' from source: magic vars 19110 1726882575.67013: starting attempt loop 19110 1726882575.67020: running the handler 19110 1726882575.67038: _low_level_execute_command(): starting 19110 1726882575.67057: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 19110 1726882575.67847: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882575.67868: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
<<< 19110 1726882575.67883: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882575.67902: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882575.67950: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882575.67969: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882575.67984: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882575.68004: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882575.68017: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882575.68028: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882575.68042: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882575.68064: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882575.68083: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882575.68097: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882575.68109: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882575.68124: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882575.68197: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882575.68213: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882575.68227: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882575.68366: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 
1726882575.70042: stdout chunk (state=3): >>>/root <<< 19110 1726882575.70145: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882575.70219: stderr chunk (state=3): >>><<< 19110 1726882575.70221: stdout chunk (state=3): >>><<< 19110 1726882575.70318: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 1726882575.70321: _low_level_execute_command(): starting 19110 1726882575.70324: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882575.7023697-20599-205385601716249 `" && echo ansible-tmp-1726882575.7023697-20599-205385601716249="` echo /root/.ansible/tmp/ansible-tmp-1726882575.7023697-20599-205385601716249 `" ) && sleep 0' 19110 1726882575.70872: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 
3.2.2 4 Jun 2024 <<< 19110 1726882575.70885: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882575.70896: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882575.70913: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882575.70950: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882575.70960: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882575.70975: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882575.70991: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882575.71001: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882575.71011: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882575.71023: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882575.71037: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882575.71053: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882575.71069: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882575.71082: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882575.71096: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882575.71169: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882575.71188: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882575.71203: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 
<<< 19110 1726882575.71333: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882575.73241: stdout chunk (state=3): >>>ansible-tmp-1726882575.7023697-20599-205385601716249=/root/.ansible/tmp/ansible-tmp-1726882575.7023697-20599-205385601716249 <<< 19110 1726882575.73353: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882575.73427: stderr chunk (state=3): >>><<< 19110 1726882575.73430: stdout chunk (state=3): >>><<< 19110 1726882575.73754: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882575.7023697-20599-205385601716249=/root/.ansible/tmp/ansible-tmp-1726882575.7023697-20599-205385601716249 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 1726882575.73761: variable 'ansible_module_compression' from source: unknown 19110 1726882575.73766: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-191108pnkimox/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 19110 1726882575.73769: variable 'ansible_facts' from source: unknown 19110 1726882575.73771: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882575.7023697-20599-205385601716249/AnsiballZ_service_facts.py 19110 1726882575.73832: Sending initial data 19110 1726882575.73835: Sent initial data (162 bytes) 19110 1726882575.74781: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882575.74795: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882575.74807: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882575.74822: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882575.74867: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882575.74882: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882575.74896: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882575.74912: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882575.74923: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882575.74932: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882575.74942: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882575.74953: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882575.74977: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882575.74988: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 
<<< 19110 1726882575.74999: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882575.75011: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882575.75095: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882575.75111: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882575.75124: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882575.75246: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882575.77044: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 19110 1726882575.77134: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 19110 1726882575.77229: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-191108pnkimox/tmp9eccijjr /root/.ansible/tmp/ansible-tmp-1726882575.7023697-20599-205385601716249/AnsiballZ_service_facts.py <<< 19110 1726882575.77320: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 19110 1726882575.78675: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882575.78869: stderr chunk (state=3): >>><<< 19110 1726882575.78873: stdout chunk (state=3): >>><<< 19110 1726882575.78875: done 
transferring module to remote 19110 1726882575.78881: _low_level_execute_command(): starting 19110 1726882575.78883: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882575.7023697-20599-205385601716249/ /root/.ansible/tmp/ansible-tmp-1726882575.7023697-20599-205385601716249/AnsiballZ_service_facts.py && sleep 0' 19110 1726882575.79488: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882575.79500: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882575.79511: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882575.79525: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882575.79572: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882575.79585: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882575.79600: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882575.79619: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882575.79632: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882575.79645: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882575.79662: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882575.79681: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882575.79697: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882575.79709: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882575.79720: stderr chunk (state=3): >>>debug2: 
match found <<< 19110 1726882575.79734: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882575.79814: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882575.79836: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882575.79852: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882575.79987: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882575.81883: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882575.81886: stdout chunk (state=3): >>><<< 19110 1726882575.81889: stderr chunk (state=3): >>><<< 19110 1726882575.81978: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 1726882575.81982: 
_low_level_execute_command(): starting 19110 1726882575.81985: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882575.7023697-20599-205385601716249/AnsiballZ_service_facts.py && sleep 0' 19110 1726882575.82533: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882575.82548: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882575.82566: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882575.82585: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882575.82626: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882575.82640: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882575.82654: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882575.82677: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882575.82691: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882575.82702: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882575.82715: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882575.82731: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882575.82747: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882575.82760: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882575.82775: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882575.82791: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882575.82870: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882575.82893: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882575.82910: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882575.83041: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882577.14426: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": 
"dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": 
{"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.serv<<< 19110 1726882577.14447: stdout chunk (state=3): >>>ice", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rhsmcertd.service": {"name": "rhsmcertd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": 
"rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, 
"systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "s<<< 19110 1726882577.14459: stdout chunk (state=3): >>>tatic", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": 
"ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alia<<< 19110 1726882577.14493: stdout chunk (state=3): >>>s", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhcd.service": {"name": "rhcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm-facts.service": {"name": "rhsm-facts.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm.service": {"name": "rhsm.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper<<< 19110 1726882577.14501: stdout chunk (state=3): >>>-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": 
"sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, 
"systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 19110 1726882577.15703: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. <<< 19110 1726882577.15762: stderr chunk (state=3): >>><<< 19110 1726882577.15767: stdout chunk (state=3): >>><<< 19110 1726882577.15795: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": 
"running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": 
"not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": 
"enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rhsmcertd.service": {"name": "rhsmcertd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": 
"not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": 
"systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": 
{"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": 
"ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": 
"inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": 
"disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhcd.service": {"name": "rhcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm-facts.service": {"name": "rhsm-facts.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm.service": {"name": "rhsm.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", 
"state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": 
"systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", 
"state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared 
connection to 10.31.44.90 closed. 19110 1726882577.16394: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882575.7023697-20599-205385601716249/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 19110 1726882577.16402: _low_level_execute_command(): starting 19110 1726882577.16411: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882575.7023697-20599-205385601716249/ > /dev/null 2>&1 && sleep 0' 19110 1726882577.16875: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882577.16878: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882577.16913: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882577.16917: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882577.16919: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882577.16973: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882577.16977: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882577.17077: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882577.18838: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882577.18887: stderr chunk (state=3): >>><<< 19110 1726882577.18890: stdout chunk (state=3): >>><<< 19110 1726882577.18902: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 
1726882577.18907: handler run complete 19110 1726882577.19006: variable 'ansible_facts' from source: unknown 19110 1726882577.19113: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882577.19567: variable 'ansible_facts' from source: unknown 19110 1726882577.19648: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882577.19757: attempt loop complete, returning result 19110 1726882577.19762: _execute() done 19110 1726882577.19764: dumping result to json 19110 1726882577.19798: done dumping result, returning 19110 1726882577.19807: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which services are running [0e448fcc-3ce9-5372-c19a-00000000045d] 19110 1726882577.19812: sending task result for task 0e448fcc-3ce9-5372-c19a-00000000045d 19110 1726882577.20487: done sending task result for task 0e448fcc-3ce9-5372-c19a-00000000045d 19110 1726882577.20490: WORKER PROCESS EXITING ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 19110 1726882577.20536: no more pending results, returning what we have 19110 1726882577.20538: results queue empty 19110 1726882577.20539: checking for any_errors_fatal 19110 1726882577.20541: done checking for any_errors_fatal 19110 1726882577.20542: checking for max_fail_percentage 19110 1726882577.20543: done checking for max_fail_percentage 19110 1726882577.20543: checking to see if all hosts have failed and the running result is not ok 19110 1726882577.20544: done checking to see if all hosts have failed 19110 1726882577.20544: getting the remaining hosts for this loop 19110 1726882577.20545: done getting the remaining hosts for this loop 19110 1726882577.20547: getting the next task for host managed_node1 19110 1726882577.20551: done getting next task for host 
managed_node1 19110 1726882577.20553: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 19110 1726882577.20555: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19110 1726882577.20561: getting variables 19110 1726882577.20562: in VariableManager get_vars() 19110 1726882577.20586: Calling all_inventory to load vars for managed_node1 19110 1726882577.20588: Calling groups_inventory to load vars for managed_node1 19110 1726882577.20589: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882577.20596: Calling all_plugins_play to load vars for managed_node1 19110 1726882577.20598: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882577.20599: Calling groups_plugins_play to load vars for managed_node1 19110 1726882577.21734: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882577.23734: done with get_vars() 19110 1726882577.23760: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 21:36:17 -0400 (0:00:01.589) 0:00:34.095 ****** 19110 1726882577.23868: entering _queue_task() for managed_node1/package_facts 19110 1726882577.24180: worker is 1 (out of 1 available) 19110 1726882577.24213: exiting _queue_task() for 
managed_node1/package_facts 19110 1726882577.24233: done queuing things up, now waiting for results queue to drain 19110 1726882577.24235: waiting for pending results... 19110 1726882577.24615: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which packages are installed 19110 1726882577.24703: in run() - task 0e448fcc-3ce9-5372-c19a-00000000045e 19110 1726882577.24715: variable 'ansible_search_path' from source: unknown 19110 1726882577.24718: variable 'ansible_search_path' from source: unknown 19110 1726882577.24746: calling self._execute() 19110 1726882577.24823: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882577.24827: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882577.24834: variable 'omit' from source: magic vars 19110 1726882577.25113: variable 'ansible_distribution_major_version' from source: facts 19110 1726882577.25122: Evaluated conditional (ansible_distribution_major_version != '6'): True 19110 1726882577.25128: variable 'omit' from source: magic vars 19110 1726882577.25176: variable 'omit' from source: magic vars 19110 1726882577.25203: variable 'omit' from source: magic vars 19110 1726882577.25240: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19110 1726882577.25268: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19110 1726882577.25284: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19110 1726882577.25297: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19110 1726882577.25307: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19110 1726882577.25332: variable 'inventory_hostname' from source: host vars for 
'managed_node1' 19110 1726882577.25336: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882577.25338: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882577.25408: Set connection var ansible_timeout to 10 19110 1726882577.25418: Set connection var ansible_module_compression to ZIP_DEFLATED 19110 1726882577.25421: Set connection var ansible_shell_executable to /bin/sh 19110 1726882577.25424: Set connection var ansible_shell_type to sh 19110 1726882577.25426: Set connection var ansible_connection to ssh 19110 1726882577.25431: Set connection var ansible_pipelining to False 19110 1726882577.25452: variable 'ansible_shell_executable' from source: unknown 19110 1726882577.25457: variable 'ansible_connection' from source: unknown 19110 1726882577.25460: variable 'ansible_module_compression' from source: unknown 19110 1726882577.25463: variable 'ansible_shell_type' from source: unknown 19110 1726882577.25467: variable 'ansible_shell_executable' from source: unknown 19110 1726882577.25470: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882577.25476: variable 'ansible_pipelining' from source: unknown 19110 1726882577.25479: variable 'ansible_timeout' from source: unknown 19110 1726882577.25490: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882577.25637: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 19110 1726882577.25646: variable 'omit' from source: magic vars 19110 1726882577.25652: starting attempt loop 19110 1726882577.25658: running the handler 19110 1726882577.25675: _low_level_execute_command(): starting 19110 1726882577.25682: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 19110 
1726882577.26316: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882577.26320: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882577.26345: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 19110 1726882577.26350: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882577.26668: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882577.26685: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882577.26699: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882577.26832: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882577.28437: stdout chunk (state=3): >>>/root <<< 19110 1726882577.28621: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882577.28624: stdout chunk (state=3): >>><<< 19110 1726882577.28627: stderr chunk (state=3): >>><<< 19110 1726882577.28742: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 1726882577.28746: _low_level_execute_command(): starting 19110 1726882577.28749: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882577.2864575-20678-81066145729989 `" && echo ansible-tmp-1726882577.2864575-20678-81066145729989="` echo /root/.ansible/tmp/ansible-tmp-1726882577.2864575-20678-81066145729989 `" ) && sleep 0' 19110 1726882577.29731: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882577.29745: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882577.29762: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882577.29782: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882577.29827: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 
10.31.44.90 originally 10.31.44.90 <<< 19110 1726882577.29860: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882577.29877: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882577.30522: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882577.30539: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882577.30550: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882577.30566: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882577.30587: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882577.30613: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882577.30630: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882577.30640: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882577.30652: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882577.30739: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882577.30785: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882577.30800: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882577.30936: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882577.32822: stdout chunk (state=3): >>>ansible-tmp-1726882577.2864575-20678-81066145729989=/root/.ansible/tmp/ansible-tmp-1726882577.2864575-20678-81066145729989 <<< 19110 1726882577.32942: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882577.33024: stderr chunk (state=3): >>><<< 19110 
1726882577.33038: stdout chunk (state=3): >>><<< 19110 1726882577.33175: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882577.2864575-20678-81066145729989=/root/.ansible/tmp/ansible-tmp-1726882577.2864575-20678-81066145729989 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 1726882577.33179: variable 'ansible_module_compression' from source: unknown 19110 1726882577.33294: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-191108pnkimox/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 19110 1726882577.33297: variable 'ansible_facts' from source: unknown 19110 1726882577.33450: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882577.2864575-20678-81066145729989/AnsiballZ_package_facts.py 19110 1726882577.33629: Sending initial data 19110 1726882577.33632: Sent initial data (161 bytes) 19110 1726882577.34634: stderr chunk (state=3): 
>>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882577.34649: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882577.34670: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882577.34689: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882577.34739: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882577.34752: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882577.34772: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882577.34792: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882577.34804: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882577.34815: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882577.34831: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882577.34849: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882577.34872: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882577.34888: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882577.34902: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882577.34918: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882577.35005: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882577.35027: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882577.35045: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 19110 1726882577.35188: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882577.36913: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 19110 1726882577.37004: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 19110 1726882577.37101: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-191108pnkimox/tmp07mb3gkt /root/.ansible/tmp/ansible-tmp-1726882577.2864575-20678-81066145729989/AnsiballZ_package_facts.py <<< 19110 1726882577.37190: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 19110 1726882577.40390: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882577.40593: stderr chunk (state=3): >>><<< 19110 1726882577.40596: stdout chunk (state=3): >>><<< 19110 1726882577.40599: done transferring module to remote 19110 1726882577.40601: _low_level_execute_command(): starting 19110 1726882577.40603: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882577.2864575-20678-81066145729989/ /root/.ansible/tmp/ansible-tmp-1726882577.2864575-20678-81066145729989/AnsiballZ_package_facts.py && sleep 0' 19110 1726882577.41905: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 
19110 1726882577.41913: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882577.41922: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882577.41935: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882577.41972: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882577.41978: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882577.41988: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882577.42000: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882577.42007: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882577.42018: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882577.42029: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882577.42041: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882577.42056: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882577.42078: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882577.42089: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882577.42102: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882577.42340: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882577.42355: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882577.42371: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 
1726882577.42582: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882577.44408: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882577.44411: stdout chunk (state=3): >>><<< 19110 1726882577.44414: stderr chunk (state=3): >>><<< 19110 1726882577.44469: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 1726882577.44473: _low_level_execute_command(): starting 19110 1726882577.44475: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882577.2864575-20678-81066145729989/AnsiballZ_package_facts.py && sleep 0' 19110 1726882577.46183: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882577.46197: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 
19110 1726882577.46212: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882577.46232: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882577.46277: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882577.46290: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882577.46304: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882577.46322: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882577.46335: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882577.46347: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882577.46365: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882577.46382: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882577.46399: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882577.46411: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882577.46423: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882577.46437: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882577.46511: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882577.46528: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882577.46544: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882577.46780: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 
1726882577.92445: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "subscription-manager-rhsm-certificates": [{"name": "subscription-manager-rhsm-certificates", "version": "20220623", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": 
"kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "122.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "122.el9", "e<<< 19110 1726882577.92468: stdout chunk (state=3): >>>poch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": 
[{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dmidecode": [{"name": "dmidecode", "version": "3.6", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": 
"8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source"<<< 19110 1726882577.92484: stdout chunk (state=3): >>>: "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": 
"8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release<<< 19110 1726882577.92488: stdout chunk (state=3): >>>": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-iniparse": [{"name": "python3-iniparse", "version": "0.4", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-inotify": [{"name": "python3-inotify", "version": "0.9.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", 
"release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-decorator": [{"name": "python3-decorator", "version": "4.4.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-cloud-what": [{"name": "python3-cloud-what", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": 
"cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]<<< 19110 1726882577.92586: stdout chunk (state=3): >>>, "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "virt-what": [{"name": "virt-what", "version": "1.25", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": 
"3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], 
"grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": 
[{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "usermode": [{"name": "usermode", "version": "1.114", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", 
"release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf-plugin-subscription-manager": [{"name": "libdnf-plugin-subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-librepo": [{"name": "python3-librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-subscription-manager-rhsm": [{"name": "python3-subscription-manager-rhsm", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "subscription-manager": [{"name": "subscription-manager", "version": "1.29.42", 
"release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "policycoreutils-python-utils": [{"name": "policycoreutils-python-utils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": 
"rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "rhc": [{"name": "rhc", "version": "0.2.4", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": 
[{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": 
"noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": 
"perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": 
"x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": 
[{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", 
"release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": 
[{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": 
{"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 19110 1726882577.94130: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. <<< 19110 1726882577.94209: stderr chunk (state=3): >>><<< 19110 1726882577.94212: stdout chunk (state=3): >>><<< 19110 1726882577.94279: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "subscription-manager-rhsm-certificates": [{"name": "subscription-manager-rhsm-certificates", "version": "20220623", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": 
"20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": 
"x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dmidecode": [{"name": "dmidecode", "version": "3.6", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": 
"lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", 
"version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": 
"python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-iniparse": [{"name": "python3-iniparse", "version": "0.4", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-inotify": [{"name": "python3-inotify", "version": "0.9.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", 
"version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-decorator": [{"name": "python3-decorator", "version": "4.4.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-cloud-what": [{"name": "python3-cloud-what", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "45.el9", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "virt-what": [{"name": "virt-what", "version": "1.25", "release": "10.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"systemd": [{"name": "systemd", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": 
"grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", 
"version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "usermode": [{"name": "usermode", "version": "1.114", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": 
[{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf-plugin-subscription-manager": [{"name": "libdnf-plugin-subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-librepo": [{"name": "python3-librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-subscription-manager-rhsm": [{"name": "python3-subscription-manager-rhsm", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "subscription-manager": [{"name": 
"subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "policycoreutils-python-utils": [{"name": "policycoreutils-python-utils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "rhc": [{"name": "rhc", "version": "0.2.4", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": 
"rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", 
"release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": 
"18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", 
"release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", 
"source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", 
"release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", 
"release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": 
[{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
19110 1726882577.99152: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882577.2864575-20678-81066145729989/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 19110 1726882577.99185: _low_level_execute_command(): starting 19110 1726882577.99232: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882577.2864575-20678-81066145729989/ > /dev/null 2>&1 && sleep 0' 19110 1726882578.02048: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882578.02126: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882578.02143: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882578.02170: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882578.02213: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882578.02342: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882578.02361: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882578.02382: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882578.02395: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address 
<<< 19110 1726882578.02408: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882578.02421: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882578.02437: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882578.02459: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882578.02475: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882578.02487: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882578.02502: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882578.02588: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882578.02675: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882578.02692: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882578.02820: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882578.04745: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882578.04748: stdout chunk (state=3): >>><<< 19110 1726882578.04750: stderr chunk (state=3): >>><<< 19110 1726882578.04871: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 1726882578.04875: handler run complete 19110 1726882578.06261: variable 'ansible_facts' from source: unknown 19110 1726882578.07128: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882578.11988: variable 'ansible_facts' from source: unknown 19110 1726882578.13139: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882578.14950: attempt loop complete, returning result 19110 1726882578.15095: _execute() done 19110 1726882578.15103: dumping result to json 19110 1726882578.15549: done dumping result, returning 19110 1726882578.15570: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which packages are installed [0e448fcc-3ce9-5372-c19a-00000000045e] 19110 1726882578.15628: sending task result for task 0e448fcc-3ce9-5372-c19a-00000000045e 19110 1726882578.19739: done sending task result for task 0e448fcc-3ce9-5372-c19a-00000000045e 19110 1726882578.19742: WORKER PROCESS EXITING ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 19110 1726882578.19975: no more pending results, returning what we have 19110 1726882578.19979: results queue empty 19110 1726882578.19980: checking for 
any_errors_fatal 19110 1726882578.19988: done checking for any_errors_fatal 19110 1726882578.19989: checking for max_fail_percentage 19110 1726882578.19990: done checking for max_fail_percentage 19110 1726882578.19991: checking to see if all hosts have failed and the running result is not ok 19110 1726882578.19992: done checking to see if all hosts have failed 19110 1726882578.19993: getting the remaining hosts for this loop 19110 1726882578.19995: done getting the remaining hosts for this loop 19110 1726882578.19999: getting the next task for host managed_node1 19110 1726882578.20007: done getting next task for host managed_node1 19110 1726882578.20011: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 19110 1726882578.20013: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19110 1726882578.20023: getting variables 19110 1726882578.20025: in VariableManager get_vars() 19110 1726882578.20066: Calling all_inventory to load vars for managed_node1 19110 1726882578.20069: Calling groups_inventory to load vars for managed_node1 19110 1726882578.20072: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882578.20083: Calling all_plugins_play to load vars for managed_node1 19110 1726882578.20086: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882578.20089: Calling groups_plugins_play to load vars for managed_node1 19110 1726882578.23634: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882578.27295: done with get_vars() 19110 1726882578.27318: done getting variables 19110 1726882578.27989: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 21:36:18 -0400 (0:00:01.041) 0:00:35.137 ****** 19110 1726882578.28022: entering _queue_task() for managed_node1/debug 19110 1726882578.28335: worker is 1 (out of 1 available) 19110 1726882578.28347: exiting _queue_task() for managed_node1/debug 19110 1726882578.28361: done queuing things up, now waiting for results queue to drain 19110 1726882578.28362: waiting for pending results... 
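The `auto-mux: Trying existing master` and `mux_client_request_session` stderr lines above show Ansible reusing an existing OpenSSH control socket instead of opening a fresh TCP/SSH handshake for each module invocation, which is why the `rm -f -r .../tmp/...` cleanup command completes in well under a second. This behavior corresponds to connection settings along these lines (a sketch; values shown are Ansible's usual defaults, and the exact `control_path` template varies by version):

```ini
; ansible.cfg — SSH connection multiplexing (illustrative values)
[ssh_connection]
ssh_args = -C -o ControlMaster=auto -o ControlPersist=60s
control_path_dir = ~/.ansible/cp
```

With `ControlPersist` active, the first connection to `10.31.44.90` leaves a master process behind, and every later `_low_level_execute_command()` in the log rides over it (`master session id: 2`).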
19110 1726882578.28753: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider 19110 1726882578.28878: in run() - task 0e448fcc-3ce9-5372-c19a-00000000005d 19110 1726882578.28903: variable 'ansible_search_path' from source: unknown 19110 1726882578.28916: variable 'ansible_search_path' from source: unknown 19110 1726882578.28959: calling self._execute() 19110 1726882578.29069: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882578.29082: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882578.29095: variable 'omit' from source: magic vars 19110 1726882578.29514: variable 'ansible_distribution_major_version' from source: facts 19110 1726882578.29560: Evaluated conditional (ansible_distribution_major_version != '6'): True 19110 1726882578.29578: variable 'omit' from source: magic vars 19110 1726882578.29620: variable 'omit' from source: magic vars 19110 1726882578.29763: variable 'network_provider' from source: set_fact 19110 1726882578.29794: variable 'omit' from source: magic vars 19110 1726882578.29839: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19110 1726882578.29885: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19110 1726882578.29915: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19110 1726882578.29936: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19110 1726882578.29951: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19110 1726882578.29989: variable 'inventory_hostname' from source: host vars for 'managed_node1' 19110 1726882578.29998: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 
1726882578.30012: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882578.30127: Set connection var ansible_timeout to 10 19110 1726882578.30145: Set connection var ansible_module_compression to ZIP_DEFLATED 19110 1726882578.30157: Set connection var ansible_shell_executable to /bin/sh 19110 1726882578.30167: Set connection var ansible_shell_type to sh 19110 1726882578.30174: Set connection var ansible_connection to ssh 19110 1726882578.30183: Set connection var ansible_pipelining to False 19110 1726882578.30208: variable 'ansible_shell_executable' from source: unknown 19110 1726882578.30219: variable 'ansible_connection' from source: unknown 19110 1726882578.30232: variable 'ansible_module_compression' from source: unknown 19110 1726882578.30239: variable 'ansible_shell_type' from source: unknown 19110 1726882578.30246: variable 'ansible_shell_executable' from source: unknown 19110 1726882578.30254: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882578.30268: variable 'ansible_pipelining' from source: unknown 19110 1726882578.30276: variable 'ansible_timeout' from source: unknown 19110 1726882578.30283: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882578.30430: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 19110 1726882578.30453: variable 'omit' from source: magic vars 19110 1726882578.30467: starting attempt loop 19110 1726882578.30474: running the handler 19110 1726882578.30523: handler run complete 19110 1726882578.30542: attempt loop complete, returning result 19110 1726882578.30558: _execute() done 19110 1726882578.30567: dumping result to json 19110 1726882578.30574: done dumping result, returning 
19110 1726882578.30585: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider [0e448fcc-3ce9-5372-c19a-00000000005d] 19110 1726882578.30596: sending task result for task 0e448fcc-3ce9-5372-c19a-00000000005d ok: [managed_node1] => {} MSG: Using network provider: nm 19110 1726882578.30746: no more pending results, returning what we have 19110 1726882578.30750: results queue empty 19110 1726882578.30751: checking for any_errors_fatal 19110 1726882578.30765: done checking for any_errors_fatal 19110 1726882578.30766: checking for max_fail_percentage 19110 1726882578.30768: done checking for max_fail_percentage 19110 1726882578.30769: checking to see if all hosts have failed and the running result is not ok 19110 1726882578.30770: done checking to see if all hosts have failed 19110 1726882578.30771: getting the remaining hosts for this loop 19110 1726882578.30772: done getting the remaining hosts for this loop 19110 1726882578.30776: getting the next task for host managed_node1 19110 1726882578.30783: done getting next task for host managed_node1 19110 1726882578.30788: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 19110 1726882578.30790: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19110 1726882578.30802: getting variables 19110 1726882578.30804: in VariableManager get_vars() 19110 1726882578.30844: Calling all_inventory to load vars for managed_node1 19110 1726882578.30847: Calling groups_inventory to load vars for managed_node1 19110 1726882578.30850: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882578.30864: Calling all_plugins_play to load vars for managed_node1 19110 1726882578.30885: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882578.30890: Calling groups_plugins_play to load vars for managed_node1 19110 1726882578.31883: done sending task result for task 0e448fcc-3ce9-5372-c19a-00000000005d 19110 1726882578.31886: WORKER PROCESS EXITING 19110 1726882578.32766: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882578.34653: done with get_vars() 19110 1726882578.34685: done getting variables 19110 1726882578.34747: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 21:36:18 -0400 (0:00:00.067) 0:00:35.205 ****** 19110 1726882578.34786: entering _queue_task() for managed_node1/fail 19110 1726882578.35114: worker is 1 (out of 1 available) 19110 1726882578.35131: exiting _queue_task() for managed_node1/fail 19110 1726882578.35142: done queuing things up, now waiting for results queue to drain 19110 1726882578.35144: waiting for pending results... 
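The `ok: [managed_node1] => {}` result with `MSG: Using network provider: nm` above comes from the role's "Print network provider" debug task (tasks/main.yml:7), with `network_provider` previously established via `set_fact` as the log notes. A rough reconstruction of that task — hypothetical, inferred from the log rather than quoted from the role source:

```yaml
# Sketch of the task producing the result above (reconstructed, not verbatim).
- name: Print network provider
  debug:
    msg: "Using network provider: {{ network_provider }}"
```

Because `debug` runs entirely on the controller, no module payload is shipped to the host, which matches the absence of any `_low_level_execute_command()` calls for this task in the log.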
19110 1726882578.35451: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 19110 1726882578.35587: in run() - task 0e448fcc-3ce9-5372-c19a-00000000005e 19110 1726882578.35609: variable 'ansible_search_path' from source: unknown 19110 1726882578.35618: variable 'ansible_search_path' from source: unknown 19110 1726882578.35669: calling self._execute() 19110 1726882578.35785: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882578.35800: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882578.35819: variable 'omit' from source: magic vars 19110 1726882578.36225: variable 'ansible_distribution_major_version' from source: facts 19110 1726882578.36363: Evaluated conditional (ansible_distribution_major_version != '6'): True 19110 1726882578.36612: variable 'network_state' from source: role '' defaults 19110 1726882578.36628: Evaluated conditional (network_state != {}): False 19110 1726882578.36678: when evaluation is False, skipping this task 19110 1726882578.36688: _execute() done 19110 1726882578.36696: dumping result to json 19110 1726882578.36704: done dumping result, returning 19110 1726882578.36715: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0e448fcc-3ce9-5372-c19a-00000000005e] 19110 1726882578.36727: sending task result for task 0e448fcc-3ce9-5372-c19a-00000000005e 19110 1726882578.36898: done sending task result for task 0e448fcc-3ce9-5372-c19a-00000000005e skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 19110 1726882578.36946: no more pending results, returning what we have 19110 1726882578.36950: results 
queue empty 19110 1726882578.36952: checking for any_errors_fatal 19110 1726882578.36962: done checking for any_errors_fatal 19110 1726882578.36965: checking for max_fail_percentage 19110 1726882578.36968: done checking for max_fail_percentage 19110 1726882578.36969: checking to see if all hosts have failed and the running result is not ok 19110 1726882578.36970: done checking to see if all hosts have failed 19110 1726882578.36970: getting the remaining hosts for this loop 19110 1726882578.36972: done getting the remaining hosts for this loop 19110 1726882578.36976: getting the next task for host managed_node1 19110 1726882578.36983: done getting next task for host managed_node1 19110 1726882578.36988: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 19110 1726882578.36990: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19110 1726882578.37007: getting variables 19110 1726882578.37009: in VariableManager get_vars() 19110 1726882578.37053: Calling all_inventory to load vars for managed_node1 19110 1726882578.37059: Calling groups_inventory to load vars for managed_node1 19110 1726882578.37062: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882578.37078: Calling all_plugins_play to load vars for managed_node1 19110 1726882578.37081: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882578.37084: Calling groups_plugins_play to load vars for managed_node1 19110 1726882578.38112: WORKER PROCESS EXITING 19110 1726882578.40397: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882578.44980: done with get_vars() 19110 1726882578.45010: done getting variables 19110 1726882578.45075: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 21:36:18 -0400 (0:00:00.103) 0:00:35.308 ****** 19110 1726882578.45106: entering _queue_task() for managed_node1/fail 19110 1726882578.45829: worker is 1 (out of 1 available) 19110 1726882578.45842: exiting _queue_task() for managed_node1/fail 19110 1726882578.45852: done queuing things up, now waiting for results queue to drain 19110 1726882578.45854: waiting for pending results... 
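The skips above both report `"false_condition": "network_state != {}"`: the `when:` list on these guard tasks is evaluated in order, and once the first condition is false the remaining ones are never reached, so the later version check never appears in the log. A minimal sketch of the kind of guard task that produces this trace, with an assumed failure message (the real task is at `roles/network/tasks/main.yml:18` in `fedora.linux_system_roles.network` and may differ in detail):

```yaml
# Hypothetical reconstruction of the skipped guard task, not the role's
# verbatim source. The conditions in the `when:` list short-circuit:
# with the role default network_state == {}, the first condition is
# false and the task is skipped before the version check runs.
- name: Abort applying the network state configuration if the system version of the managed host is below 8
  ansible.builtin.fail:
    msg: Applying the network state configuration requires EL8 or later  # assumed wording
  when:
    - network_state != {}
    - ansible_distribution_major_version | int < 8
```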
19110 1726882578.46652: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 19110 1726882578.46839: in run() - task 0e448fcc-3ce9-5372-c19a-00000000005f 19110 1726882578.46977: variable 'ansible_search_path' from source: unknown 19110 1726882578.46986: variable 'ansible_search_path' from source: unknown 19110 1726882578.47029: calling self._execute() 19110 1726882578.47161: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882578.47290: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882578.47306: variable 'omit' from source: magic vars 19110 1726882578.48119: variable 'ansible_distribution_major_version' from source: facts 19110 1726882578.48137: Evaluated conditional (ansible_distribution_major_version != '6'): True 19110 1726882578.48475: variable 'network_state' from source: role '' defaults 19110 1726882578.48490: Evaluated conditional (network_state != {}): False 19110 1726882578.48499: when evaluation is False, skipping this task 19110 1726882578.48508: _execute() done 19110 1726882578.48516: dumping result to json 19110 1726882578.48523: done dumping result, returning 19110 1726882578.48533: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0e448fcc-3ce9-5372-c19a-00000000005f] 19110 1726882578.48546: sending task result for task 0e448fcc-3ce9-5372-c19a-00000000005f 19110 1726882578.48773: done sending task result for task 0e448fcc-3ce9-5372-c19a-00000000005f 19110 1726882578.48781: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 19110 1726882578.48840: no more pending results, returning what we have 19110 
1726882578.48845: results queue empty 19110 1726882578.48846: checking for any_errors_fatal 19110 1726882578.48854: done checking for any_errors_fatal 19110 1726882578.48858: checking for max_fail_percentage 19110 1726882578.48860: done checking for max_fail_percentage 19110 1726882578.48861: checking to see if all hosts have failed and the running result is not ok 19110 1726882578.48862: done checking to see if all hosts have failed 19110 1726882578.48862: getting the remaining hosts for this loop 19110 1726882578.48867: done getting the remaining hosts for this loop 19110 1726882578.48871: getting the next task for host managed_node1 19110 1726882578.48879: done getting next task for host managed_node1 19110 1726882578.48883: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 19110 1726882578.48885: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19110 1726882578.48900: getting variables 19110 1726882578.48902: in VariableManager get_vars() 19110 1726882578.48941: Calling all_inventory to load vars for managed_node1 19110 1726882578.48945: Calling groups_inventory to load vars for managed_node1 19110 1726882578.48948: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882578.48963: Calling all_plugins_play to load vars for managed_node1 19110 1726882578.48969: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882578.48972: Calling groups_plugins_play to load vars for managed_node1 19110 1726882578.51717: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882578.55687: done with get_vars() 19110 1726882578.55721: done getting variables 19110 1726882578.55980: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 21:36:18 -0400 (0:00:00.109) 0:00:35.417 ****** 19110 1726882578.56015: entering _queue_task() for managed_node1/fail 19110 1726882578.56766: worker is 1 (out of 1 available) 19110 1726882578.56779: exiting _queue_task() for managed_node1/fail 19110 1726882578.56790: done queuing things up, now waiting for results queue to drain 19110 1726882578.56792: waiting for pending results... 
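The next task's trace shows two conditionals evaluated in sequence: the role-wide `ansible_distribution_major_version != '6'` gate (true), then `ansible_distribution_major_version | int > 9` (false on this EL9 host), so the task is skipped. A plausible sketch of that guard, assuming a hypothetical message (the real task is at `roles/network/tasks/main.yml:25`):

```yaml
# Hypothetical reconstruction; the message text is an assumption.
# The `| int` filter matters here: ansible_distribution_major_version
# is a string fact, so a bare > comparison would compare lexically.
- name: Abort applying teaming configuration if the system version of the managed host is EL10 or later
  ansible.builtin.fail:
    msg: Team interfaces are not supported on EL10 or later  # assumed wording
  when: ansible_distribution_major_version | int > 9
```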
19110 1726882578.58245: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 19110 1726882578.58358: in run() - task 0e448fcc-3ce9-5372-c19a-000000000060 19110 1726882578.59287: variable 'ansible_search_path' from source: unknown 19110 1726882578.59298: variable 'ansible_search_path' from source: unknown 19110 1726882578.59343: calling self._execute() 19110 1726882578.59441: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882578.59454: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882578.59472: variable 'omit' from source: magic vars 19110 1726882578.59853: variable 'ansible_distribution_major_version' from source: facts 19110 1726882578.60784: Evaluated conditional (ansible_distribution_major_version != '6'): True 19110 1726882578.60959: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 19110 1726882578.67160: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 19110 1726882578.67244: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 19110 1726882578.67405: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 19110 1726882578.67445: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 19110 1726882578.67592: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 19110 1726882578.67676: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19110 1726882578.67715: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19110 1726882578.67826: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19110 1726882578.67875: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19110 1726882578.67923: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19110 1726882578.68107: variable 'ansible_distribution_major_version' from source: facts 19110 1726882578.68247: Evaluated conditional (ansible_distribution_major_version | int > 9): False 19110 1726882578.68254: when evaluation is False, skipping this task 19110 1726882578.68261: _execute() done 19110 1726882578.68270: dumping result to json 19110 1726882578.68277: done dumping result, returning 19110 1726882578.68290: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0e448fcc-3ce9-5372-c19a-000000000060] 19110 1726882578.68300: sending task result for task 0e448fcc-3ce9-5372-c19a-000000000060 skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version | int > 9", "skip_reason": "Conditional result was False" } 19110 1726882578.68457: no more pending results, returning what we have 19110 1726882578.68462: results queue empty 19110 1726882578.68465: checking for any_errors_fatal 19110 1726882578.68472: done checking for any_errors_fatal 19110 
1726882578.68472: checking for max_fail_percentage 19110 1726882578.68474: done checking for max_fail_percentage 19110 1726882578.68476: checking to see if all hosts have failed and the running result is not ok 19110 1726882578.68477: done checking to see if all hosts have failed 19110 1726882578.68477: getting the remaining hosts for this loop 19110 1726882578.68479: done getting the remaining hosts for this loop 19110 1726882578.68483: getting the next task for host managed_node1 19110 1726882578.68489: done getting next task for host managed_node1 19110 1726882578.68493: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 19110 1726882578.68496: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19110 1726882578.68510: getting variables 19110 1726882578.68511: in VariableManager get_vars() 19110 1726882578.68552: Calling all_inventory to load vars for managed_node1 19110 1726882578.68555: Calling groups_inventory to load vars for managed_node1 19110 1726882578.68558: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882578.68570: Calling all_plugins_play to load vars for managed_node1 19110 1726882578.68574: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882578.68577: Calling groups_plugins_play to load vars for managed_node1 19110 1726882578.69629: done sending task result for task 0e448fcc-3ce9-5372-c19a-000000000060 19110 1726882578.69632: WORKER PROCESS EXITING 19110 1726882578.72023: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882578.75149: done with get_vars() 19110 1726882578.75176: done getting variables 19110 1726882578.75230: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 21:36:18 -0400 (0:00:00.192) 0:00:35.609 ****** 19110 1726882578.75258: entering _queue_task() for managed_node1/dnf 19110 1726882578.76061: worker is 1 (out of 1 available) 19110 1726882578.76075: exiting _queue_task() for managed_node1/dnf 19110 1726882578.76087: done queuing things up, now waiting for results queue to drain 19110 1726882578.76089: waiting for pending results... 
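The DNF task queued above is skipped because `__network_wireless_connections_defined or __network_team_connections_defined` is false: the trace shows both role defaults being resolved against `network_connections` from play vars (via `profile` and `interface`), and neither a wireless nor a team profile is present. A rough sketch of the shape of such a conditional check task, under the assumption that it uses check mode and a `network_packages` variable (both assumptions, not confirmed by the log):

```yaml
# Hypothetical sketch only. The package list variable name and the use
# of check_mode are assumptions; the guard condition is taken verbatim
# from the log's false_condition.
- name: Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
  ansible.builtin.dnf:
    name: "{{ network_packages }}"  # assumed variable name
    state: latest
  check_mode: true   # report available updates without installing them
  when: __network_wireless_connections_defined or __network_team_connections_defined
```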
19110 1726882578.77546: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 19110 1726882578.77857: in run() - task 0e448fcc-3ce9-5372-c19a-000000000061 19110 1726882578.77881: variable 'ansible_search_path' from source: unknown 19110 1726882578.77914: variable 'ansible_search_path' from source: unknown 19110 1726882578.77959: calling self._execute() 19110 1726882578.78172: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882578.78244: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882578.78261: variable 'omit' from source: magic vars 19110 1726882578.79097: variable 'ansible_distribution_major_version' from source: facts 19110 1726882578.79213: Evaluated conditional (ansible_distribution_major_version != '6'): True 19110 1726882578.79663: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 19110 1726882578.85714: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 19110 1726882578.85795: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 19110 1726882578.85906: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 19110 1726882578.86082: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 19110 1726882578.86114: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 19110 1726882578.86313: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19110 1726882578.86347: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19110 1726882578.86495: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19110 1726882578.86540: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19110 1726882578.86562: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19110 1726882578.86834: variable 'ansible_distribution' from source: facts 19110 1726882578.86927: variable 'ansible_distribution_major_version' from source: facts 19110 1726882578.86947: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 19110 1726882578.87121: variable '__network_wireless_connections_defined' from source: role '' defaults 19110 1726882578.87502: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19110 1726882578.87532: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19110 1726882578.87567: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19110 1726882578.87726: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19110 1726882578.87747: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19110 1726882578.87804: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19110 1726882578.87897: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19110 1726882578.87931: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19110 1726882578.88059: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19110 1726882578.88081: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19110 1726882578.88240: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19110 1726882578.88275: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19110 
1726882578.88304: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19110 1726882578.88470: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19110 1726882578.88491: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19110 1726882578.88662: variable 'network_connections' from source: play vars 19110 1726882578.88896: variable 'profile' from source: play vars 19110 1726882578.88967: variable 'profile' from source: play vars 19110 1726882578.88977: variable 'interface' from source: set_fact 19110 1726882578.89042: variable 'interface' from source: set_fact 19110 1726882578.89175: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 19110 1726882578.89448: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 19110 1726882578.89495: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 19110 1726882578.89533: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 19110 1726882578.89575: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 19110 1726882578.89628: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 19110 1726882578.89731: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 19110 1726882578.89778: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 19110 1726882578.89810: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 19110 1726882578.89914: variable '__network_team_connections_defined' from source: role '' defaults 19110 1726882578.90456: variable 'network_connections' from source: play vars 19110 1726882578.90534: variable 'profile' from source: play vars 19110 1726882578.90624: variable 'profile' from source: play vars 19110 1726882578.90693: variable 'interface' from source: set_fact 19110 1726882578.90830: variable 'interface' from source: set_fact 19110 1726882578.90864: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 19110 1726882578.90908: when evaluation is False, skipping this task 19110 1726882578.90921: _execute() done 19110 1726882578.90928: dumping result to json 19110 1726882578.90935: done dumping result, returning 19110 1726882578.90947: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0e448fcc-3ce9-5372-c19a-000000000061] 19110 1726882578.90997: sending task result for task 0e448fcc-3ce9-5372-c19a-000000000061 19110 1726882578.91162: done sending task result for task 0e448fcc-3ce9-5372-c19a-000000000061 19110 1726882578.91173: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": 
"Conditional result was False" } 19110 1726882578.91226: no more pending results, returning what we have 19110 1726882578.91230: results queue empty 19110 1726882578.91231: checking for any_errors_fatal 19110 1726882578.91239: done checking for any_errors_fatal 19110 1726882578.91240: checking for max_fail_percentage 19110 1726882578.91242: done checking for max_fail_percentage 19110 1726882578.91243: checking to see if all hosts have failed and the running result is not ok 19110 1726882578.91244: done checking to see if all hosts have failed 19110 1726882578.91244: getting the remaining hosts for this loop 19110 1726882578.91246: done getting the remaining hosts for this loop 19110 1726882578.91251: getting the next task for host managed_node1 19110 1726882578.91258: done getting next task for host managed_node1 19110 1726882578.91264: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 19110 1726882578.91266: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19110 1726882578.91282: getting variables 19110 1726882578.91284: in VariableManager get_vars() 19110 1726882578.91325: Calling all_inventory to load vars for managed_node1 19110 1726882578.91328: Calling groups_inventory to load vars for managed_node1 19110 1726882578.91330: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882578.91341: Calling all_plugins_play to load vars for managed_node1 19110 1726882578.91344: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882578.91348: Calling groups_plugins_play to load vars for managed_node1 19110 1726882578.94544: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882578.99304: done with get_vars() 19110 1726882578.99339: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 19110 1726882578.99420: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 21:36:18 -0400 (0:00:00.241) 0:00:35.851 ****** 19110 1726882578.99451: entering _queue_task() for managed_node1/yum 19110 1726882579.00197: worker is 1 (out of 1 available) 19110 1726882579.00210: exiting _queue_task() for managed_node1/yum 19110 1726882579.00221: done queuing things up, now waiting for results queue to drain 19110 1726882579.00223: waiting for pending results... 
19110 1726882579.01096: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 19110 1726882579.01333: in run() - task 0e448fcc-3ce9-5372-c19a-000000000062 19110 1726882579.01469: variable 'ansible_search_path' from source: unknown 19110 1726882579.01480: variable 'ansible_search_path' from source: unknown 19110 1726882579.01523: calling self._execute() 19110 1726882579.01648: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882579.01779: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882579.01793: variable 'omit' from source: magic vars 19110 1726882579.02482: variable 'ansible_distribution_major_version' from source: facts 19110 1726882579.02585: Evaluated conditional (ansible_distribution_major_version != '6'): True 19110 1726882579.02992: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 19110 1726882579.05411: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 19110 1726882579.05488: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 19110 1726882579.05528: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 19110 1726882579.05572: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 19110 1726882579.05611: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 19110 1726882579.05706: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19110 1726882579.05741: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
19110 1726882579.05775: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
19110 1726882579.05830: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
19110 1726882579.05849: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
19110 1726882579.05954: variable 'ansible_distribution_major_version' from source: facts
19110 1726882579.05976: Evaluated conditional (ansible_distribution_major_version | int < 8): False
19110 1726882579.05984: when evaluation is False, skipping this task
19110 1726882579.05992: _execute() done
19110 1726882579.05998: dumping result to json
19110 1726882579.06011: done dumping result, returning
19110 1726882579.06028: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0e448fcc-3ce9-5372-c19a-000000000062]
19110 1726882579.06040: sending task result for task 0e448fcc-3ce9-5372-c19a-000000000062
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version | int < 8",
    "skip_reason": "Conditional result was False"
}
19110 1726882579.06200: no more pending results, returning what we have
19110 1726882579.06205: results queue empty
19110 1726882579.06206: checking for any_errors_fatal
19110 1726882579.06213: done checking for any_errors_fatal
19110 1726882579.06214: checking for max_fail_percentage
19110 1726882579.06216: done checking for max_fail_percentage
19110 1726882579.06217: checking to see if all hosts have failed and the running result is not ok
19110 1726882579.06217: done checking to see if all hosts have failed
19110 1726882579.06218: getting the remaining hosts for this loop
19110 1726882579.06220: done getting the remaining hosts for this loop
19110 1726882579.06224: getting the next task for host managed_node1
19110 1726882579.06231: done getting next task for host managed_node1
19110 1726882579.06235: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces
19110 1726882579.06237: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
19110 1726882579.06250: getting variables
19110 1726882579.06252: in VariableManager get_vars()
19110 1726882579.06298: Calling all_inventory to load vars for managed_node1
19110 1726882579.06302: Calling groups_inventory to load vars for managed_node1
19110 1726882579.06304: Calling all_plugins_inventory to load vars for managed_node1
19110 1726882579.06315: Calling all_plugins_play to load vars for managed_node1
19110 1726882579.06319: Calling groups_plugins_inventory to load vars for managed_node1
19110 1726882579.06322: Calling groups_plugins_play to load vars for managed_node1
19110 1726882579.07362: done sending task result for task 0e448fcc-3ce9-5372-c19a-000000000062
19110 1726882579.07367: WORKER PROCESS EXITING
19110 1726882579.08183: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
19110 1726882579.09924: done with get_vars()
19110 1726882579.09948: done getting variables
19110 1726882579.10010: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] ***
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60
Friday 20 September 2024 21:36:19 -0400 (0:00:00.105) 0:00:35.957 ******
19110 1726882579.10041: entering _queue_task() for managed_node1/fail
19110 1726882579.10340: worker is 1 (out of 1 available)
19110 1726882579.10352: exiting _queue_task() for managed_node1/fail
19110 1726882579.10365: done queuing things up, now waiting for results queue to drain
19110 1726882579.10367: waiting for pending results...
19110 1726882579.10644: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces
19110 1726882579.10765: in run() - task 0e448fcc-3ce9-5372-c19a-000000000063
19110 1726882579.10785: variable 'ansible_search_path' from source: unknown
19110 1726882579.10794: variable 'ansible_search_path' from source: unknown
19110 1726882579.10837: calling self._execute()
19110 1726882579.10934: variable 'ansible_host' from source: host vars for 'managed_node1'
19110 1726882579.10948: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
19110 1726882579.10969: variable 'omit' from source: magic vars
19110 1726882579.11352: variable 'ansible_distribution_major_version' from source: facts
19110 1726882579.11374: Evaluated conditional (ansible_distribution_major_version != '6'): True
19110 1726882579.11507: variable '__network_wireless_connections_defined' from source: role '' defaults
19110 1726882579.11712: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
19110 1726882579.20662: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
19110 1726882579.20824: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
19110 1726882579.20876: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
19110 1726882579.20988: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
19110 1726882579.21069: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
19110 1726882579.21215: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
19110 1726882579.21383: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
19110 1726882579.21416: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
19110 1726882579.21490: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
19110 1726882579.21511: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
19110 1726882579.21621: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
19110 1726882579.21709: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
19110 1726882579.21739: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
19110 1726882579.21902: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
19110 1726882579.21929: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
19110 1726882579.21981: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
19110 1726882579.22119: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
19110 1726882579.22159: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
19110 1726882579.22206: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
19110 1726882579.22239: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
19110 1726882579.23398: variable 'network_connections' from source: play vars
19110 1726882579.23457: variable 'profile' from source: play vars
19110 1726882579.23625: variable 'profile' from source: play vars
19110 1726882579.23778: variable 'interface' from source: set_fact
19110 1726882579.23942: variable 'interface' from source: set_fact
19110 1726882579.24135: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
19110 1726882579.24546: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
19110 1726882579.24592: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
19110 1726882579.24627: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
19110 1726882579.24676: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
19110 1726882579.24797: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
19110 1726882579.24888: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
19110 1726882579.24917: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
19110 1726882579.24999: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
19110 1726882579.25039: variable '__network_team_connections_defined' from source: role '' defaults
19110 1726882579.25494: variable 'network_connections' from source: play vars
19110 1726882579.25631: variable 'profile' from source: play vars
19110 1726882579.25697: variable 'profile' from source: play vars
19110 1726882579.25738: variable 'interface' from source: set_fact
19110 1726882579.25875: variable 'interface' from source: set_fact
19110 1726882579.25901: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False
19110 1726882579.25953: when evaluation is False, skipping this task
19110 1726882579.25964: _execute() done
19110 1726882579.25971: dumping result to json
19110 1726882579.25978: done dumping result, returning
19110 1726882579.25989: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0e448fcc-3ce9-5372-c19a-000000000063]
19110 1726882579.26070: sending task result for task 0e448fcc-3ce9-5372-c19a-000000000063
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined",
    "skip_reason": "Conditional result was False"
}
19110 1726882579.26213: no more pending results, returning what we have
19110 1726882579.26217: results queue empty
19110 1726882579.26218: checking for any_errors_fatal
19110 1726882579.26224: done checking for any_errors_fatal
19110 1726882579.26225: checking for max_fail_percentage
19110 1726882579.26227: done checking for max_fail_percentage
19110 1726882579.26228: checking to see if all hosts have failed and the running result is not ok
19110 1726882579.26229: done checking to see if all hosts have failed
19110 1726882579.26230: getting the remaining hosts for this loop
19110 1726882579.26231: done getting the remaining hosts for this loop
19110 1726882579.26235: getting the next task for host managed_node1
19110 1726882579.26240: done getting next task for host managed_node1
19110 1726882579.26244: ^ task is: TASK: fedora.linux_system_roles.network : Install packages
19110 1726882579.26246: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
19110 1726882579.26262: getting variables
19110 1726882579.26268: in VariableManager get_vars()
19110 1726882579.26305: Calling all_inventory to load vars for managed_node1
19110 1726882579.26308: Calling groups_inventory to load vars for managed_node1
19110 1726882579.26310: Calling all_plugins_inventory to load vars for managed_node1
19110 1726882579.26320: Calling all_plugins_play to load vars for managed_node1
19110 1726882579.26322: Calling groups_plugins_inventory to load vars for managed_node1
19110 1726882579.26325: Calling groups_plugins_play to load vars for managed_node1
19110 1726882579.27498: done sending task result for task 0e448fcc-3ce9-5372-c19a-000000000063
19110 1726882579.27502: WORKER PROCESS EXITING
19110 1726882579.39524: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
19110 1726882579.43748: done with get_vars()
19110 1726882579.43783: done getting variables
19110 1726882579.43931: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Install packages] ********************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73
Friday 20 September 2024 21:36:19 -0400 (0:00:00.339) 0:00:36.296 ******
19110 1726882579.43966: entering _queue_task() for managed_node1/package
19110 1726882579.44677: worker is 1 (out of 1 available)
19110 1726882579.44815: exiting _queue_task() for managed_node1/package
19110 1726882579.44828: done queuing things up, now waiting for results queue to drain
19110 1726882579.44829: waiting for pending results...
19110 1726882579.45780: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages
19110 1726882579.46101: in run() - task 0e448fcc-3ce9-5372-c19a-000000000064
19110 1726882579.46154: variable 'ansible_search_path' from source: unknown
19110 1726882579.46237: variable 'ansible_search_path' from source: unknown
19110 1726882579.46290: calling self._execute()
19110 1726882579.46495: variable 'ansible_host' from source: host vars for 'managed_node1'
19110 1726882579.46584: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
19110 1726882579.46599: variable 'omit' from source: magic vars
19110 1726882579.47566: variable 'ansible_distribution_major_version' from source: facts
19110 1726882579.47586: Evaluated conditional (ansible_distribution_major_version != '6'): True
19110 1726882579.48008: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
19110 1726882579.48592: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
19110 1726882579.48700: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
19110 1726882579.48824: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
19110 1726882579.49007: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
19110 1726882579.49221: variable 'network_packages' from source: role '' defaults
19110 1726882579.49498: variable '__network_provider_setup' from source: role '' defaults
19110 1726882579.49527: variable '__network_service_name_default_nm' from source: role '' defaults
19110 1726882579.49698: variable '__network_service_name_default_nm' from source: role '' defaults
19110 1726882579.49743: variable '__network_packages_default_nm' from source: role '' defaults
19110 1726882579.49809: variable '__network_packages_default_nm' from source: role '' defaults
19110 1726882579.50252: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
19110 1726882579.55126: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
19110 1726882579.55339: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
19110 1726882579.55442: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
19110 1726882579.55521: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
19110 1726882579.55632: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
19110 1726882579.55771: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
19110 1726882579.55899: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
19110 1726882579.55968: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
19110 1726882579.56088: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
19110 1726882579.56149: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
19110 1726882579.56262: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
19110 1726882579.56314: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
19110 1726882579.56406: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
19110 1726882579.56505: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
19110 1726882579.56583: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
19110 1726882579.56992: variable '__network_packages_default_gobject_packages' from source: role '' defaults
19110 1726882579.57338: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
19110 1726882579.57374: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
19110 1726882579.57467: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
19110 1726882579.57510: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
19110 1726882579.57561: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
19110 1726882579.57743: variable 'ansible_python' from source: facts
19110 1726882579.57812: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults
19110 1726882579.58031: variable '__network_wpa_supplicant_required' from source: role '' defaults
19110 1726882579.58236: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults
19110 1726882579.58486: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
19110 1726882579.58543: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
19110 1726882579.58596: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
19110 1726882579.58645: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
19110 1726882579.58674: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
19110 1726882579.58722: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
19110 1726882579.58772: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
19110 1726882579.58801: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
19110 1726882579.58851: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
19110 1726882579.58879: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
19110 1726882579.59034: variable 'network_connections' from source: play vars
19110 1726882579.59044: variable 'profile' from source: play vars
19110 1726882579.59345: variable 'profile' from source: play vars
19110 1726882579.59361: variable 'interface' from source: set_fact
19110 1726882579.59550: variable 'interface' from source: set_fact
19110 1726882579.59624: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
19110 1726882579.59695: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
19110 1726882579.59730: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
19110 1726882579.59892: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
19110 1726882579.59940: variable '__network_wireless_connections_defined' from source: role '' defaults
19110 1726882579.60809: variable 'network_connections' from source: play vars
19110 1726882579.60818: variable 'profile' from source: play vars
19110 1726882579.60928: variable 'profile' from source: play vars
19110 1726882579.60987: variable 'interface' from source: set_fact
19110 1726882579.61077: variable 'interface' from source: set_fact
19110 1726882579.61123: variable '__network_packages_default_wireless' from source: role '' defaults
19110 1726882579.61232: variable '__network_wireless_connections_defined' from source: role '' defaults
19110 1726882579.61572: variable 'network_connections' from source: play vars
19110 1726882579.61583: variable 'profile' from source: play vars
19110 1726882579.61658: variable 'profile' from source: play vars
19110 1726882579.61671: variable 'interface' from source: set_fact
19110 1726882579.61781: variable 'interface' from source: set_fact
19110 1726882579.61818: variable '__network_packages_default_team' from source: role '' defaults
19110 1726882579.61900: variable '__network_team_connections_defined' from source: role '' defaults
19110 1726882579.62238: variable 'network_connections' from source: play vars
19110 1726882579.62258: variable 'profile' from source: play vars
19110 1726882579.62324: variable 'profile' from source: play vars
19110 1726882579.62333: variable 'interface' from source: set_fact
19110 1726882579.62440: variable 'interface' from source: set_fact
19110 1726882579.62511: variable '__network_service_name_default_initscripts' from source: role '' defaults
19110 1726882579.62582: variable '__network_service_name_default_initscripts' from source: role '' defaults
19110 1726882579.62594: variable '__network_packages_default_initscripts' from source: role '' defaults
19110 1726882579.62657: variable '__network_packages_default_initscripts' from source: role '' defaults
19110 1726882579.63170: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults
19110 1726882579.64198: variable 'network_connections' from source: play vars
19110 1726882579.64360: variable 'profile' from source: play vars
19110 1726882579.64425: variable 'profile' from source: play vars
19110 1726882579.64434: variable 'interface' from source: set_fact
19110 1726882579.64586: variable 'interface' from source: set_fact
19110 1726882579.64651: variable 'ansible_distribution' from source: facts
19110 1726882579.64689: variable '__network_rh_distros' from source: role '' defaults
19110 1726882579.64737: variable 'ansible_distribution_major_version' from source: facts
19110 1726882579.64753: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults
19110 1726882579.65198: variable 'ansible_distribution' from source: facts
19110 1726882579.65251: variable '__network_rh_distros' from source: role '' defaults
19110 1726882579.65275: variable 'ansible_distribution_major_version' from source: facts
19110 1726882579.65293: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults
19110 1726882579.65497: variable 'ansible_distribution' from source: facts
19110 1726882579.65506: variable '__network_rh_distros' from source: role '' defaults
19110 1726882579.65515: variable 'ansible_distribution_major_version' from source: facts
19110 1726882579.65566: variable 'network_provider' from source: set_fact
19110 1726882579.65587: variable 'ansible_facts' from source: unknown
19110 1726882579.66333: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False
19110 1726882579.66341: when evaluation is False, skipping this task
19110 1726882579.66347: _execute() done
19110 1726882579.66353: dumping result to json
19110 1726882579.66363: done dumping result, returning
19110 1726882579.66377: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages [0e448fcc-3ce9-5372-c19a-000000000064]
19110 1726882579.66387: sending task result for task 0e448fcc-3ce9-5372-c19a-000000000064
19110 1726882579.66506: done sending task result for task 0e448fcc-3ce9-5372-c19a-000000000064
19110 1726882579.66513: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "not network_packages is subset(ansible_facts.packages.keys())",
    "skip_reason": "Conditional result was False"
}
19110 1726882579.66577: no more pending results, returning what we have
19110 1726882579.66581: results queue empty
19110 1726882579.66582: checking for any_errors_fatal
19110 1726882579.66590: done checking for any_errors_fatal
19110 1726882579.66591: checking for max_fail_percentage
19110 1726882579.66593: done checking for max_fail_percentage
19110 1726882579.66594: checking to see if all hosts have failed and the running result is not ok
19110 1726882579.66595: done checking to see if all hosts have failed
19110 1726882579.66596: getting the remaining hosts for this loop
19110 1726882579.66597: done getting the remaining hosts for this loop
19110 1726882579.66601: getting the next task for host managed_node1
19110 1726882579.66607: done getting next task for host managed_node1
19110 1726882579.66611: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable
19110 1726882579.66613: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
19110 1726882579.66627: getting variables
19110 1726882579.66629: in VariableManager get_vars()
19110 1726882579.66676: Calling all_inventory to load vars for managed_node1
19110 1726882579.66679: Calling groups_inventory to load vars for managed_node1
19110 1726882579.66682: Calling all_plugins_inventory to load vars for managed_node1
19110 1726882579.66698: Calling all_plugins_play to load vars for managed_node1
19110 1726882579.66701: Calling groups_plugins_inventory to load vars for managed_node1
19110 1726882579.66704: Calling groups_plugins_play to load vars for managed_node1
19110 1726882579.70849: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
19110 1726882579.74390: done with get_vars()
19110 1726882579.74533: done getting variables
19110 1726882579.74600: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] ***
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85
Friday 20 September 2024 21:36:19 -0400 (0:00:00.307) 0:00:36.604 ******
19110 1726882579.74750: entering _queue_task() for managed_node1/package
19110 1726882579.75541: worker is 1 (out of 1 available)
19110 1726882579.75558: exiting _queue_task() for managed_node1/package
19110 1726882579.75571: done queuing things up, now waiting for results queue to drain
19110 1726882579.75573: waiting for pending results...
19110 1726882579.75798: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable
19110 1726882579.75899: in run() - task 0e448fcc-3ce9-5372-c19a-000000000065
19110 1726882579.75913: variable 'ansible_search_path' from source: unknown
19110 1726882579.75917: variable 'ansible_search_path' from source: unknown
19110 1726882579.75962: calling self._execute()
19110 1726882579.76054: variable 'ansible_host' from source: host vars for 'managed_node1'
19110 1726882579.76068: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
19110 1726882579.76078: variable 'omit' from source: magic vars
19110 1726882579.76462: variable 'ansible_distribution_major_version' from source: facts
19110 1726882579.76474: Evaluated conditional (ansible_distribution_major_version != '6'): True
19110 1726882579.76640: variable 'network_state' from source: role '' defaults
19110 1726882579.76649: Evaluated conditional (network_state != {}): False
19110 1726882579.76652: when evaluation is False, skipping this task
19110 1726882579.76657: _execute() done
19110 1726882579.76660: dumping result to json
19110 1726882579.76662: done dumping result, returning
19110 1726882579.76671: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0e448fcc-3ce9-5372-c19a-000000000065]
19110 1726882579.76677: sending task result for task 0e448fcc-3ce9-5372-c19a-000000000065
19110 1726882579.76781: done sending task result for task 0e448fcc-3ce9-5372-c19a-000000000065
19110 1726882579.76784: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
19110 1726882579.76837: no more pending results, returning what we have
19110 1726882579.76841: results queue empty
19110 1726882579.76842: checking for any_errors_fatal
19110 1726882579.76850: done checking for any_errors_fatal
19110 1726882579.76851: checking for max_fail_percentage
19110 1726882579.76853: done checking for max_fail_percentage
19110 1726882579.76854: checking to see if all hosts have failed and the running result is not ok
19110 1726882579.76858: done checking to see if all hosts have failed
19110 1726882579.76858: getting the remaining hosts for this loop
19110 1726882579.76860: done getting the remaining hosts for this loop
19110 1726882579.76866: getting the next task for host managed_node1
19110 1726882579.76874: done getting next task for host managed_node1
19110 1726882579.76879: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable
19110 1726882579.76882: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
19110 1726882579.76899: getting variables
19110 1726882579.76901: in VariableManager get_vars()
19110 1726882579.76947: Calling all_inventory to load vars for managed_node1
19110 1726882579.76950: Calling groups_inventory to load vars for managed_node1
19110 1726882579.76952: Calling all_plugins_inventory to load vars for managed_node1
19110 1726882579.76968: Calling all_plugins_play to load vars for managed_node1
19110 1726882579.76971: Calling groups_plugins_inventory to load vars for managed_node1
19110 1726882579.76974: Calling groups_plugins_play to load vars for managed_node1
19110 1726882579.79027: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
19110 1726882579.80964: done with get_vars()
19110 1726882579.80987: done getting variables
19110 1726882579.81041: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] ***
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96
Friday 20 September 2024 21:36:19 -0400 (0:00:00.064) 0:00:36.669 ******
19110 1726882579.81189: entering _queue_task() for managed_node1/package
19110 1726882579.81823: worker is 1 (out of 1 available)
19110 1726882579.81837: exiting _queue_task() for managed_node1/package
19110 1726882579.81848: done queuing things up, now waiting for results queue to drain
19110 1726882579.81849: waiting for pending results...
19110 1726882579.82410: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable
19110 1726882579.82536: in run() - task 0e448fcc-3ce9-5372-c19a-000000000066
19110 1726882579.82553: variable 'ansible_search_path' from source: unknown
19110 1726882579.82560: variable 'ansible_search_path' from source: unknown
19110 1726882579.82609: calling self._execute()
19110 1726882579.82719: variable 'ansible_host' from source: host vars for 'managed_node1'
19110 1726882579.82732: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
19110 1726882579.82748: variable 'omit' from source: magic vars
19110 1726882579.83163: variable 'ansible_distribution_major_version' from source: facts
19110 1726882579.83193: Evaluated conditional (ansible_distribution_major_version != '6'): True
19110 1726882579.83336: variable 'network_state' from source: role '' defaults
19110 1726882579.83357: Evaluated conditional (network_state != {}): False
19110 1726882579.83367: when evaluation is False, skipping this task
19110 1726882579.83376: _execute() done
19110 1726882579.83389: dumping result to json
19110 1726882579.83399: done dumping result, returning
19110 1726882579.83411: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0e448fcc-3ce9-5372-c19a-000000000066]
19110 1726882579.83424: sending task result for task 0e448fcc-3ce9-5372-c19a-000000000066
19110 1726882579.83552: done sending task result for task 0e448fcc-3ce9-5372-c19a-000000000066
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
19110 1726882579.83604: no more pending results, returning what we have
19110 1726882579.83609: results queue empty
19110 1726882579.83610: checking for any_errors_fatal
19110 1726882579.83618: done checking for any_errors_fatal
19110 1726882579.83619: checking for max_fail_percentage
19110 1726882579.83620: done checking for max_fail_percentage
19110 1726882579.83621: checking to see if all hosts have failed and the running result is not ok
19110 1726882579.83622: done checking to see if all hosts have failed
19110 1726882579.83623: getting the remaining hosts for this loop
19110 1726882579.83624: done getting the remaining hosts for this loop
19110 1726882579.83628: getting the next task for host managed_node1
19110 1726882579.83635: done getting next task for host managed_node1
19110 1726882579.83640: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces
19110 1726882579.83642: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
19110 1726882579.83658: getting variables
19110 1726882579.83660: in VariableManager get_vars()
19110 1726882579.83715: Calling all_inventory to load vars for managed_node1
19110 1726882579.83719: Calling groups_inventory to load vars for managed_node1
19110 1726882579.83721: Calling all_plugins_inventory to load vars for managed_node1
19110 1726882579.83734: Calling all_plugins_play to load vars for managed_node1
19110 1726882579.83737: Calling groups_plugins_inventory to load vars for managed_node1
19110 1726882579.83740: Calling groups_plugins_play to load vars for managed_node1
19110 1726882579.85014: WORKER PROCESS EXITING
19110 1726882579.86882: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
19110 1726882579.90608: done with get_vars()
19110 1726882579.90674: done getting variables
19110 1726882579.90738: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] ***
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109
Friday 20 September 2024  21:36:19 -0400 (0:00:00.096)       0:00:36.766 ******
19110 1726882579.90887: entering _queue_task() for managed_node1/service
19110 1726882579.91578: worker is 1 (out of 1 available)
19110 1726882579.91589: exiting _queue_task() for managed_node1/service
19110 1726882579.91600: done queuing things up, now waiting for results queue to drain
19110 1726882579.91602: waiting for pending results...
19110 1726882579.92539: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces
19110 1726882579.92656: in run() - task 0e448fcc-3ce9-5372-c19a-000000000067
19110 1726882579.92785: variable 'ansible_search_path' from source: unknown
19110 1726882579.92793: variable 'ansible_search_path' from source: unknown
19110 1726882579.92841: calling self._execute()
19110 1726882579.93007: variable 'ansible_host' from source: host vars for 'managed_node1'
19110 1726882579.93144: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
19110 1726882579.93158: variable 'omit' from source: magic vars
19110 1726882579.94002: variable 'ansible_distribution_major_version' from source: facts
19110 1726882579.94143: Evaluated conditional (ansible_distribution_major_version != '6'): True
19110 1726882579.94400: variable '__network_wireless_connections_defined' from source: role '' defaults
19110 1726882579.94999: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
19110 1726882580.00580: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
19110 1726882580.00793: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
19110 1726882580.00837: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
19110 1726882580.00997: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
19110 1726882580.01041: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
19110 1726882580.01260: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
19110 1726882580.01305: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
19110 1726882580.01453: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
19110 1726882580.01518: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
19110 1726882580.01568: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
19110 1726882580.01703: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
19110 1726882580.01733: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
19110 1726882580.01903: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
19110 1726882580.01966: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
19110 1726882580.02006: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
19110 1726882580.02123: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
19110 1726882580.02200: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
19110 1726882580.02232: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
19110 1726882580.02360: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
19110 1726882580.02436: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
19110 1726882580.02826: variable 'network_connections' from source: play vars
19110 1726882580.02884: variable 'profile' from source: play vars
19110 1726882580.03032: variable 'profile' from source: play vars
19110 1726882580.03172: variable 'interface' from source: set_fact
19110 1726882580.03243: variable 'interface' from source: set_fact
19110 1726882580.03337: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
19110 1726882580.03777: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
19110 1726882580.03871: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
19110 1726882580.03985: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
19110 1726882580.04078: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
19110 1726882580.04124: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
19110 1726882580.04290: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
19110 1726882580.04323: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
19110 1726882580.04354: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
19110 1726882580.04422: variable '__network_team_connections_defined' from source: role '' defaults
19110 1726882580.04901: variable 'network_connections' from source: play vars
19110 1726882580.05035: variable 'profile' from source: play vars
19110 1726882580.05183: variable 'profile' from source: play vars
19110 1726882580.05192: variable 'interface' from source: set_fact
19110 1726882580.05387: variable 'interface' from source: set_fact
19110 1726882580.05428: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False
19110 1726882580.05468: when evaluation is False, skipping this task
19110 1726882580.05483: _execute() done
19110 1726882580.05497: dumping result to json
19110 1726882580.05576: done dumping result, returning
19110 1726882580.05592: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0e448fcc-3ce9-5372-c19a-000000000067]
19110 1726882580.05613: sending task result for task 0e448fcc-3ce9-5372-c19a-000000000067
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined",
    "skip_reason": "Conditional result was False"
}
19110 1726882580.05776: no more pending results, returning what we have
19110 1726882580.05780: results queue empty
19110 1726882580.05781: checking for any_errors_fatal
19110 1726882580.05791: done checking for any_errors_fatal
19110 1726882580.05792: checking for max_fail_percentage
19110 1726882580.05793: done checking for max_fail_percentage
19110 1726882580.05794: checking to see if all hosts have failed and the running result is not ok
19110 1726882580.05795: done checking to see if all hosts have failed
19110 1726882580.05796: getting the remaining hosts for this loop
19110 1726882580.05797: done getting the remaining hosts for this loop
19110 1726882580.05801: getting the next task for host managed_node1
19110 1726882580.05808: done getting next task for host managed_node1
19110 1726882580.05812: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager
19110 1726882580.05814: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
19110 1726882580.05829: getting variables
19110 1726882580.05831: in VariableManager get_vars()
19110 1726882580.05882: Calling all_inventory to load vars for managed_node1
19110 1726882580.05885: Calling groups_inventory to load vars for managed_node1
19110 1726882580.05888: Calling all_plugins_inventory to load vars for managed_node1
19110 1726882580.05906: Calling all_plugins_play to load vars for managed_node1
19110 1726882580.05910: Calling groups_plugins_inventory to load vars for managed_node1
19110 1726882580.05914: Calling groups_plugins_play to load vars for managed_node1
19110 1726882580.06934: done sending task result for task 0e448fcc-3ce9-5372-c19a-000000000067
19110 1726882580.06937: WORKER PROCESS EXITING
19110 1726882580.08084: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
19110 1726882580.09986: done with get_vars()
19110 1726882580.10015: done getting variables
19110 1726882580.10075: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] *****
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
Friday 20 September 2024  21:36:20 -0400 (0:00:00.192)       0:00:36.958 ******
19110 1726882580.10104: entering _queue_task() for managed_node1/service
19110 1726882580.10418: worker is 1 (out of 1 available)
19110 1726882580.10430: exiting _queue_task() for managed_node1/service
19110 1726882580.10449: done queuing things up, now waiting for results queue to drain
19110 1726882580.10451: waiting for pending results...
19110 1726882580.10791: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager
19110 1726882580.10923: in run() - task 0e448fcc-3ce9-5372-c19a-000000000068
19110 1726882580.10952: variable 'ansible_search_path' from source: unknown
19110 1726882580.10963: variable 'ansible_search_path' from source: unknown
19110 1726882580.11021: calling self._execute()
19110 1726882580.11123: variable 'ansible_host' from source: host vars for 'managed_node1'
19110 1726882580.11138: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
19110 1726882580.11156: variable 'omit' from source: magic vars
19110 1726882580.11605: variable 'ansible_distribution_major_version' from source: facts
19110 1726882580.11629: Evaluated conditional (ansible_distribution_major_version != '6'): True
19110 1726882580.11813: variable 'network_provider' from source: set_fact
19110 1726882580.11830: variable 'network_state' from source: role '' defaults
19110 1726882580.11848: Evaluated conditional (network_provider == "nm" or network_state != {}): True
19110 1726882580.11865: variable 'omit' from source: magic vars
19110 1726882580.11914: variable 'omit' from source: magic vars
19110 1726882580.11948: variable 'network_service_name' from source: role '' defaults
19110 1726882580.12033: variable 'network_service_name' from source: role '' defaults
19110 1726882580.12166: variable '__network_provider_setup' from source: role '' defaults
19110 1726882580.12188: variable '__network_service_name_default_nm' from source: role '' defaults
19110 1726882580.12262: variable '__network_service_name_default_nm' from source: role '' defaults
19110 1726882580.12285: variable '__network_packages_default_nm' from source: role '' defaults
19110 1726882580.12360: variable '__network_packages_default_nm' from source: role '' defaults
19110 1726882580.12628: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
19110 1726882580.15148: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
19110 1726882580.15239: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
19110 1726882580.15294: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
19110 1726882580.15338: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
19110 1726882580.15374: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
19110 1726882580.15458: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
19110 1726882580.15498: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
19110 1726882580.15537: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
19110 1726882580.15604: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
19110 1726882580.15625: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
19110 1726882580.15690: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
19110 1726882580.15723: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
19110 1726882580.15757: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
19110 1726882580.15813: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
19110 1726882580.15835: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
19110 1726882580.16101: variable '__network_packages_default_gobject_packages' from source: role '' defaults
19110 1726882580.16256: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
19110 1726882580.16296: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
19110 1726882580.16333: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
19110 1726882580.16381: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
19110 1726882580.16411: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
19110 1726882580.16527: variable 'ansible_python' from source: facts
19110 1726882580.16572: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults
19110 1726882580.16679: variable '__network_wpa_supplicant_required' from source: role '' defaults
19110 1726882580.16773: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults
19110 1726882580.16917: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
19110 1726882580.16947: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
19110 1726882580.16987: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
19110 1726882580.17034: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
19110 1726882580.17058: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
19110 1726882580.17127: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
19110 1726882580.17174: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
19110 1726882580.17219: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
19110 1726882580.17274: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
19110 1726882580.17305: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
19110 1726882580.17494: variable 'network_connections' from source: play vars
19110 1726882580.17507: variable 'profile' from source: play vars
19110 1726882580.17614: variable 'profile' from source: play vars
19110 1726882580.17633: variable 'interface' from source: set_fact
19110 1726882580.17716: variable 'interface' from source: set_fact
19110 1726882580.17858: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
19110 1726882580.18106: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
19110 1726882580.18161: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
19110 1726882580.18219: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
19110 1726882580.18270: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
19110 1726882580.18348: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
19110 1726882580.18398: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
19110 1726882580.18447: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
19110 1726882580.18493: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
19110 1726882580.18554: variable '__network_wireless_connections_defined' from source: role '' defaults
19110 1726882580.18893: variable 'network_connections' from source: play vars
19110 1726882580.18909: variable 'profile' from source: play vars
19110 1726882580.18995: variable 'profile' from source: play vars
19110 1726882580.19006: variable 'interface' from source: set_fact
19110 1726882580.19084: variable 'interface' from source: set_fact
19110 1726882580.19134: variable '__network_packages_default_wireless' from source: role '' defaults
19110 1726882580.19228: variable '__network_wireless_connections_defined' from source: role '' defaults
19110 1726882580.19533: variable 'network_connections' from source: play vars
19110 1726882580.19542: variable 'profile' from source: play vars
19110 1726882580.19624: variable 'profile' from source: play vars
19110 1726882580.19633: variable 'interface' from source: set_fact
19110 1726882580.19716: variable 'interface' from source: set_fact
19110 1726882580.19747: variable '__network_packages_default_team' from source: role '' defaults
19110 1726882580.19838: variable '__network_team_connections_defined' from source: role '' defaults
19110 1726882580.20153: variable 'network_connections' from source: play vars
19110 1726882580.20169: variable 'profile' from source: play vars
19110 1726882580.20245: variable 'profile' from source: play vars
19110 1726882580.20260: variable 'interface' from source: set_fact
19110 1726882580.20336: variable 'interface' from source: set_fact
19110 1726882580.20405: variable '__network_service_name_default_initscripts' from source: role '' defaults
19110 1726882580.20476: variable '__network_service_name_default_initscripts' from source: role '' defaults
19110 1726882580.20488: variable '__network_packages_default_initscripts' from source: role '' defaults
19110 1726882580.20548: variable '__network_packages_default_initscripts' from source: role '' defaults
19110 1726882580.20791: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults
19110 1726882580.21332: variable 'network_connections' from source: play vars
19110 1726882580.21343: variable 'profile' from source: play vars
19110 1726882580.21411: variable 'profile' from source: play vars
19110 1726882580.21421: variable 'interface' from source: set_fact
19110 1726882580.21598: variable 'interface' from source: set_fact
19110 1726882580.21611: variable 'ansible_distribution' from source: facts
19110 1726882580.21661: variable '__network_rh_distros' from source: role '' defaults
19110 1726882580.21677: variable 'ansible_distribution_major_version' from source: facts
19110 1726882580.21694: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults
19110 1726882580.21943: variable 'ansible_distribution' from source: facts
19110 1726882580.21952: variable '__network_rh_distros' from source: role '' defaults
19110 1726882580.21967: variable 'ansible_distribution_major_version' from source: facts
19110 1726882580.21989: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults
19110 1726882580.22177: variable 'ansible_distribution' from source: facts
19110 1726882580.22185: variable '__network_rh_distros' from source: role '' defaults
19110 1726882580.22201: variable 'ansible_distribution_major_version' from source: facts
19110 1726882580.22263: variable 'network_provider' from source: set_fact
19110 1726882580.22296: variable 'omit' from source: magic vars
19110 1726882580.22336: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
19110 1726882580.22373: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
19110 1726882580.22396: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
19110 1726882580.22425: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
19110 1726882580.22444: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
19110 1726882580.22485: variable 'inventory_hostname' from source: host vars for 'managed_node1'
19110 1726882580.22495: variable 'ansible_host' from source: host vars for 'managed_node1'
19110 1726882580.22503: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
19110 1726882580.22613: Set connection var ansible_timeout to 10
19110 1726882580.22637: Set connection var ansible_module_compression to ZIP_DEFLATED
19110 1726882580.22651: Set connection var ansible_shell_executable to /bin/sh
19110 1726882580.22661: Set connection var ansible_shell_type to sh
19110 1726882580.22671: Set connection var ansible_connection to ssh
19110 1726882580.22679: Set connection var ansible_pipelining to False
19110 1726882580.22704: variable 'ansible_shell_executable' from source: unknown
19110 1726882580.22711: variable 'ansible_connection' from source: unknown
19110 1726882580.22718: variable 'ansible_module_compression' from source: unknown
19110 1726882580.22723: variable 'ansible_shell_type' from source: unknown
19110 1726882580.22729: variable 'ansible_shell_executable' from source: unknown
19110 1726882580.22736: variable 'ansible_host' from source: host vars for 'managed_node1'
19110 1726882580.22752: variable 'ansible_pipelining' from source: unknown
19110 1726882580.22765: variable 'ansible_timeout' from source: unknown
19110 1726882580.22774: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
19110 1726882580.22888: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
19110 1726882580.22903: variable 'omit' from source: magic vars
19110 1726882580.22912: starting attempt loop
19110 1726882580.22918: running the handler
19110 1726882580.23005: variable 'ansible_facts' from source: unknown
19110 1726882580.23997: _low_level_execute_command(): starting
19110 1726882580.24012: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
19110 1726882580.24779: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<<
19110 1726882580.24795: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
19110 1726882580.24818: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
19110 1726882580.24838: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
19110 1726882580.24886: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<<
19110 1726882580.24900: stderr chunk (state=3): >>>debug2: match not found <<<
19110 1726882580.24919: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
19110 1726882580.24941:
stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882580.24958: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882580.24973: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882580.24987: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882580.25001: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882580.25018: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882580.25039: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882580.25052: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882580.25073: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882580.25157: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882580.25182: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882580.25197: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882580.25324: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882580.26994: stdout chunk (state=3): >>>/root <<< 19110 1726882580.27182: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882580.27185: stdout chunk (state=3): >>><<< 19110 1726882580.27188: stderr chunk (state=3): >>><<< 19110 1726882580.27278: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 1726882580.27284: _low_level_execute_command(): starting 19110 1726882580.27287: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882580.2720304-20761-17974354054689 `" && echo ansible-tmp-1726882580.2720304-20761-17974354054689="` echo /root/.ansible/tmp/ansible-tmp-1726882580.2720304-20761-17974354054689 `" ) && sleep 0' 19110 1726882580.27873: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882580.27886: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882580.27899: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882580.27915: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882580.27965: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882580.27978: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882580.27991: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882580.28006: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882580.28016: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882580.28025: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882580.28044: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882580.28079: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882580.28097: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882580.28114: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882580.28125: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882580.28136: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882580.28218: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882580.28235: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882580.28250: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882580.28384: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882580.30244: stdout chunk (state=3): >>>ansible-tmp-1726882580.2720304-20761-17974354054689=/root/.ansible/tmp/ansible-tmp-1726882580.2720304-20761-17974354054689 <<< 19110 1726882580.30357: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882580.30404: stderr chunk (state=3): >>><<< 19110 1726882580.30406: stdout chunk (state=3): >>><<< 19110 1726882580.30437: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726882580.2720304-20761-17974354054689=/root/.ansible/tmp/ansible-tmp-1726882580.2720304-20761-17974354054689 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 1726882580.30443: variable 'ansible_module_compression' from source: unknown 19110 1726882580.30487: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-191108pnkimox/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 19110 1726882580.30536: variable 'ansible_facts' from source: unknown 19110 1726882580.30676: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882580.2720304-20761-17974354054689/AnsiballZ_systemd.py 19110 1726882580.30777: Sending initial data 19110 1726882580.30781: Sent initial data (155 bytes) 19110 1726882580.31583: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882580.31586: stderr chunk (state=3): >>>debug1: Reading 
configuration data /root/.ssh/config <<< 19110 1726882580.31597: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882580.31613: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882580.31669: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882580.31672: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882580.31675: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882580.31904: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882580.31908: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882580.31910: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882580.31912: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882580.31914: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882580.31915: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882580.31917: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882580.31919: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882580.31921: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882580.31922: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882580.31924: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882580.31953: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882580.32113: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 19110 1726882580.33799: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 19110 1726882580.33889: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 19110 1726882580.33984: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-191108pnkimox/tmpxai3uo6q /root/.ansible/tmp/ansible-tmp-1726882580.2720304-20761-17974354054689/AnsiballZ_systemd.py <<< 19110 1726882580.34080: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 19110 1726882580.36638: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882580.36779: stderr chunk (state=3): >>><<< 19110 1726882580.36782: stdout chunk (state=3): >>><<< 19110 1726882580.36784: done transferring module to remote 19110 1726882580.36786: _low_level_execute_command(): starting 19110 1726882580.36788: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882580.2720304-20761-17974354054689/ /root/.ansible/tmp/ansible-tmp-1726882580.2720304-20761-17974354054689/AnsiballZ_systemd.py && sleep 0' 19110 1726882580.37633: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882580.37647: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 
1726882580.37668: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882580.37703: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882580.37767: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882580.37786: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882580.37812: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882580.37832: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882580.37847: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882580.37863: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882580.37879: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882580.37894: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882580.37911: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882580.37933: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882580.37949: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882580.37968: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882580.38054: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882580.38079: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882580.38094: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882580.38273: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 
1726882580.39934: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882580.40017: stderr chunk (state=3): >>><<< 19110 1726882580.40028: stdout chunk (state=3): >>><<< 19110 1726882580.40133: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 1726882580.40141: _low_level_execute_command(): starting 19110 1726882580.40144: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882580.2720304-20761-17974354054689/AnsiballZ_systemd.py && sleep 0' 19110 1726882580.40585: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882580.40591: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882580.40599: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 
1726882580.40608: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882580.40635: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882580.40642: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882580.40650: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882580.40660: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882580.40668: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882580.40677: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882580.40685: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882580.40691: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882580.40700: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882580.40708: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882580.40713: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882580.40718: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882580.40772: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882580.40782: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882580.40787: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882580.40905: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882580.65651: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", 
"ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "618", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:24:29 EDT", "ExecMainStartTimestampMonotonic": "27221076", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "618", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.<<< 19110 
1726882580.65669: stdout chunk (state=3): >>>service", "ControlGroupId": "2455", "MemoryCurrent": "16117760", "MemoryAvailable": "infinity", "CPUUsageNSec": "972178000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", 
"LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", 
"KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSi<<< 19110 1726882580.65692: stdout chunk (state=3): >>>gnal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target NetworkManager-wait-online.service network.target multi-user.target network.service cloud-init.service", "After": "network-pre.target system.slice cloud-init-local.service sysinit.target systemd-journald.socket basic.target dbus-broker.service dbus.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:32:48 EDT", "StateChangeTimestampMonotonic": "526071006", "InactiveExitTimestamp": "Fri 2024-09-20 21:24:29 EDT", "InactiveExitTimestampMonotonic": "27221264", "ActiveEnterTimestamp": "Fri 2024-09-20 21:24:30 EDT", "ActiveEnterTimestampMonotonic": "28518220", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", 
"ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:24:29 EDT", "ConditionTimestampMonotonic": "27216465", "AssertTimestamp": "Fri 2024-09-20 21:24:29 EDT", "AssertTimestampMonotonic": "27216468", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "ac59cab3098f415297681de935e089f5", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 19110 1726882580.67223: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. <<< 19110 1726882580.67227: stdout chunk (state=3): >>><<< 19110 1726882580.67234: stderr chunk (state=3): >>><<< 19110 1726882580.67253: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "618", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:24:29 EDT", 
"ExecMainStartTimestampMonotonic": "27221076", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "618", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2455", "MemoryCurrent": "16117760", "MemoryAvailable": "infinity", "CPUUsageNSec": "972178000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", 
"DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid 
cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target NetworkManager-wait-online.service network.target multi-user.target network.service cloud-init.service", "After": "network-pre.target system.slice cloud-init-local.service sysinit.target systemd-journald.socket basic.target dbus-broker.service dbus.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": 
"loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:32:48 EDT", "StateChangeTimestampMonotonic": "526071006", "InactiveExitTimestamp": "Fri 2024-09-20 21:24:29 EDT", "InactiveExitTimestampMonotonic": "27221264", "ActiveEnterTimestamp": "Fri 2024-09-20 21:24:30 EDT", "ActiveEnterTimestampMonotonic": "28518220", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:24:29 EDT", "ConditionTimestampMonotonic": "27216465", "AssertTimestamp": "Fri 2024-09-20 21:24:29 EDT", "AssertTimestampMonotonic": "27216468", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "ac59cab3098f415297681de935e089f5", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 19110 1726882580.67472: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882580.2720304-20761-17974354054689/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 19110 1726882580.67487: _low_level_execute_command(): starting 19110 1726882580.67492: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882580.2720304-20761-17974354054689/ > 
/dev/null 2>&1 && sleep 0' 19110 1726882580.68924: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882580.68933: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882580.68948: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882580.68967: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882580.69002: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882580.69009: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882580.69019: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882580.69035: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882580.69038: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882580.69047: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882580.69059: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882580.69073: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882580.69084: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882580.69091: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882580.69098: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882580.69110: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882580.69184: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882580.69198: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 
19110 1726882580.69207: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882580.69344: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882580.71195: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882580.71198: stdout chunk (state=3): >>><<< 19110 1726882580.71200: stderr chunk (state=3): >>><<< 19110 1726882580.71269: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 1726882580.71273: handler run complete 19110 1726882580.71474: attempt loop complete, returning result 19110 1726882580.71477: _execute() done 19110 1726882580.71479: dumping result to json 19110 1726882580.71481: done dumping result, returning 19110 1726882580.71483: done running TaskExecutor() for managed_node1/TASK: 
fedora.linux_system_roles.network : Enable and start NetworkManager [0e448fcc-3ce9-5372-c19a-000000000068] 19110 1726882580.71485: sending task result for task 0e448fcc-3ce9-5372-c19a-000000000068 ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 19110 1726882580.71692: no more pending results, returning what we have 19110 1726882580.71695: results queue empty 19110 1726882580.71696: checking for any_errors_fatal 19110 1726882580.71704: done checking for any_errors_fatal 19110 1726882580.71705: checking for max_fail_percentage 19110 1726882580.71707: done checking for max_fail_percentage 19110 1726882580.71708: checking to see if all hosts have failed and the running result is not ok 19110 1726882580.71708: done checking to see if all hosts have failed 19110 1726882580.71709: getting the remaining hosts for this loop 19110 1726882580.71711: done getting the remaining hosts for this loop 19110 1726882580.71714: getting the next task for host managed_node1 19110 1726882580.71720: done getting next task for host managed_node1 19110 1726882580.71724: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 19110 1726882580.71726: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19110 1726882580.71736: getting variables 19110 1726882580.71738: in VariableManager get_vars() 19110 1726882580.71780: Calling all_inventory to load vars for managed_node1 19110 1726882580.71784: Calling groups_inventory to load vars for managed_node1 19110 1726882580.71786: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882580.71797: Calling all_plugins_play to load vars for managed_node1 19110 1726882580.71799: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882580.71802: Calling groups_plugins_play to load vars for managed_node1 19110 1726882580.72580: done sending task result for task 0e448fcc-3ce9-5372-c19a-000000000068 19110 1726882580.72584: WORKER PROCESS EXITING 19110 1726882580.75681: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882580.78498: done with get_vars() 19110 1726882580.78533: done getting variables 19110 1726882580.78716: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 21:36:20 -0400 (0:00:00.686) 0:00:37.644 ****** 19110 1726882580.78752: entering _queue_task() for managed_node1/service 19110 1726882580.79525: worker is 1 (out of 1 available) 19110 1726882580.79540: exiting _queue_task() for managed_node1/service 19110 1726882580.79553: done queuing things up, now waiting for results queue to drain 19110 1726882580.79557: waiting for pending results... 
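The "Enable and start NetworkManager" result above carries a `status` map whose keys (`ActiveState`, `SubState`, `UnitFileState`, and so on) mirror the `Key=Value` lines that `systemctl show NetworkManager.service` prints, which is the property data the systemd module reports back. A minimal sketch of turning that kind of output into a dict like the one in the log — the function name and sample text are illustrative assumptions, not part of Ansible or the role:

```python
def parse_systemctl_show(text: str) -> dict:
    """Parse `systemctl show <unit>`-style Key=Value lines into a dict.

    Hypothetical helper for reading logs like the `status` map above;
    not code from ansible-core or fedora.linux_system_roles.network.
    """
    status = {}
    for line in text.splitlines():
        if "=" in line:
            # partition keeps any "=" inside the value intact
            key, _, value = line.partition("=")
            status[key] = value
    return status

sample = "ActiveState=active\nSubState=running\nUnitFileState=enabled"
status = parse_systemctl_show(sample)
print(status["ActiveState"])  # -> active
```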
19110 1726882580.80521: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 19110 1726882580.80750: in run() - task 0e448fcc-3ce9-5372-c19a-000000000069 19110 1726882580.80894: variable 'ansible_search_path' from source: unknown 19110 1726882580.80901: variable 'ansible_search_path' from source: unknown 19110 1726882580.80943: calling self._execute() 19110 1726882580.81152: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882580.81182: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882580.81197: variable 'omit' from source: magic vars 19110 1726882580.82104: variable 'ansible_distribution_major_version' from source: facts 19110 1726882580.82122: Evaluated conditional (ansible_distribution_major_version != '6'): True 19110 1726882580.82368: variable 'network_provider' from source: set_fact 19110 1726882580.82379: Evaluated conditional (network_provider == "nm"): True 19110 1726882580.82492: variable '__network_wpa_supplicant_required' from source: role '' defaults 19110 1726882580.82717: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 19110 1726882580.83147: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 19110 1726882580.88975: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 19110 1726882580.89163: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 19110 1726882580.89212: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 19110 1726882580.89291: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 19110 1726882580.89322: Loading FilterModule 'urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 19110 1726882580.89456: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19110 1726882580.89612: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19110 1726882580.89644: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19110 1726882580.89752: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19110 1726882580.89791: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19110 1726882580.89910: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19110 1726882580.89987: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19110 1726882580.90016: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19110 1726882580.90085: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19110 1726882580.90167: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19110 1726882580.90298: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19110 1726882580.90326: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19110 1726882580.90390: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19110 1726882580.90511: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19110 1726882580.90532: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19110 1726882580.90829: variable 'network_connections' from source: play vars 19110 1726882580.90925: variable 'profile' from source: play vars 19110 1726882580.91065: variable 'profile' from source: play vars 19110 1726882580.91138: variable 'interface' from source: set_fact 19110 1726882580.91229: variable 'interface' from source: set_fact 19110 1726882580.91422: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 19110 1726882580.91825: Loading TestModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 19110 1726882580.91891: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 19110 1726882580.91928: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 19110 1726882580.92029: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 19110 1726882580.92079: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 19110 1726882580.92192: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 19110 1726882580.92331: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 19110 1726882580.92366: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 19110 1726882580.92417: variable '__network_wireless_connections_defined' from source: role '' defaults 19110 1726882580.92995: variable 'network_connections' from source: play vars 19110 1726882580.93004: variable 'profile' from source: play vars 19110 1726882580.93067: variable 'profile' from source: play vars 19110 1726882580.93088: variable 'interface' from source: set_fact 19110 1726882580.93149: variable 'interface' from source: set_fact 19110 1726882580.93332: Evaluated conditional (__network_wpa_supplicant_required): False 19110 1726882580.93339: when evaluation is False, skipping this task 19110 1726882580.93346: _execute() done 19110 1726882580.93361: dumping result 
to json 19110 1726882580.93372: done dumping result, returning 19110 1726882580.93385: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0e448fcc-3ce9-5372-c19a-000000000069] 19110 1726882580.93395: sending task result for task 0e448fcc-3ce9-5372-c19a-000000000069 skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 19110 1726882580.93569: no more pending results, returning what we have 19110 1726882580.93573: results queue empty 19110 1726882580.93574: checking for any_errors_fatal 19110 1726882580.93592: done checking for any_errors_fatal 19110 1726882580.93593: checking for max_fail_percentage 19110 1726882580.93595: done checking for max_fail_percentage 19110 1726882580.93596: checking to see if all hosts have failed and the running result is not ok 19110 1726882580.93596: done checking to see if all hosts have failed 19110 1726882580.93597: getting the remaining hosts for this loop 19110 1726882580.93599: done getting the remaining hosts for this loop 19110 1726882580.93603: getting the next task for host managed_node1 19110 1726882580.93609: done getting next task for host managed_node1 19110 1726882580.93613: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 19110 1726882580.93616: ^ state is: HOST STATE: block=2, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19110 1726882580.93630: getting variables 19110 1726882580.93632: in VariableManager get_vars() 19110 1726882580.93671: Calling all_inventory to load vars for managed_node1 19110 1726882580.93674: Calling groups_inventory to load vars for managed_node1 19110 1726882580.93677: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882580.93689: Calling all_plugins_play to load vars for managed_node1 19110 1726882580.93692: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882580.93695: Calling groups_plugins_play to load vars for managed_node1 19110 1726882580.94712: done sending task result for task 0e448fcc-3ce9-5372-c19a-000000000069 19110 1726882580.94715: WORKER PROCESS EXITING 19110 1726882580.96429: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882580.99036: done with get_vars() 19110 1726882580.99062: done getting variables 19110 1726882580.99146: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 21:36:20 -0400 (0:00:00.204) 0:00:37.849 ****** 19110 1726882580.99190: entering _queue_task() for managed_node1/service 19110 1726882580.99568: worker is 1 (out of 1 available) 19110 1726882580.99588: exiting _queue_task() for managed_node1/service 19110 1726882580.99604: done queuing things up, now waiting for results queue to drain 19110 1726882580.99606: waiting for pending results... 
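The wpa_supplicant task above is skipped with a plain result dict containing `skip_reason` and `false_condition`, exactly as printed in the log. A small sketch of how a callback or log consumer might separate such skips from tasks that actually ran — the `results` list below just restates the two results visible above; the variable names are assumptions:

```python
# Restating the two task results seen in the log above: NetworkManager ran
# (changed=False, already started), wpa_supplicant was skipped because the
# conditional __network_wpa_supplicant_required evaluated False.
results = [
    {"task": "Enable and start NetworkManager", "changed": False},
    {"task": "Enable and start wpa_supplicant", "changed": False,
     "false_condition": "__network_wpa_supplicant_required",
     "skip_reason": "Conditional result was False"},
]

# A skipped task is recognizable by the presence of "skip_reason".
skipped = [r["task"] for r in results if "skip_reason" in r]
print(skipped)  # -> ['Enable and start wpa_supplicant']
```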
19110 1726882580.99919: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service 19110 1726882581.00049: in run() - task 0e448fcc-3ce9-5372-c19a-00000000006a 19110 1726882581.00077: variable 'ansible_search_path' from source: unknown 19110 1726882581.00086: variable 'ansible_search_path' from source: unknown 19110 1726882581.00129: calling self._execute() 19110 1726882581.00243: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882581.00260: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882581.00290: variable 'omit' from source: magic vars 19110 1726882581.00750: variable 'ansible_distribution_major_version' from source: facts 19110 1726882581.00773: Evaluated conditional (ansible_distribution_major_version != '6'): True 19110 1726882581.00946: variable 'network_provider' from source: set_fact 19110 1726882581.00965: Evaluated conditional (network_provider == "initscripts"): False 19110 1726882581.00978: when evaluation is False, skipping this task 19110 1726882581.00994: _execute() done 19110 1726882581.01006: dumping result to json 19110 1726882581.01017: done dumping result, returning 19110 1726882581.01036: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service [0e448fcc-3ce9-5372-c19a-00000000006a] 19110 1726882581.01049: sending task result for task 0e448fcc-3ce9-5372-c19a-00000000006a skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 19110 1726882581.01224: no more pending results, returning what we have 19110 1726882581.01228: results queue empty 19110 1726882581.01230: checking for any_errors_fatal 19110 1726882581.01244: done checking for any_errors_fatal 19110 1726882581.01247: checking for max_fail_percentage 19110 1726882581.01250: done checking for max_fail_percentage 19110 
1726882581.01251: checking to see if all hosts have failed and the running result is not ok 19110 1726882581.01251: done checking to see if all hosts have failed 19110 1726882581.01252: getting the remaining hosts for this loop 19110 1726882581.01254: done getting the remaining hosts for this loop 19110 1726882581.01260: getting the next task for host managed_node1 19110 1726882581.01268: done getting next task for host managed_node1 19110 1726882581.01272: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 19110 1726882581.01275: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19110 1726882581.01302: getting variables 19110 1726882581.01304: in VariableManager get_vars() 19110 1726882581.01352: Calling all_inventory to load vars for managed_node1 19110 1726882581.01361: Calling groups_inventory to load vars for managed_node1 19110 1726882581.01365: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882581.01380: Calling all_plugins_play to load vars for managed_node1 19110 1726882581.01384: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882581.01391: Calling groups_plugins_play to load vars for managed_node1 19110 1726882581.02548: done sending task result for task 0e448fcc-3ce9-5372-c19a-00000000006a 19110 1726882581.02553: WORKER PROCESS EXITING 19110 1726882581.04430: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882581.07714: done with get_vars() 19110 1726882581.07736: done getting variables 19110 1726882581.07806: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 21:36:21 -0400 (0:00:00.086) 0:00:37.935 ****** 19110 1726882581.07838: entering _queue_task() for managed_node1/copy 19110 1726882581.08207: worker is 1 (out of 1 available) 19110 1726882581.08218: exiting _queue_task() for managed_node1/copy 19110 1726882581.08234: done queuing things up, now waiting for results queue to drain 19110 1726882581.08235: waiting for pending results... 19110 1726882581.08535: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 19110 1726882581.08683: in run() - task 0e448fcc-3ce9-5372-c19a-00000000006b 19110 1726882581.08711: variable 'ansible_search_path' from source: unknown 19110 1726882581.08729: variable 'ansible_search_path' from source: unknown 19110 1726882581.08794: calling self._execute() 19110 1726882581.08910: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882581.08922: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882581.08935: variable 'omit' from source: magic vars 19110 1726882581.09378: variable 'ansible_distribution_major_version' from source: facts 19110 1726882581.09396: Evaluated conditional (ansible_distribution_major_version != '6'): True 19110 1726882581.09542: variable 'network_provider' from source: set_fact 19110 1726882581.09736: Evaluated conditional (network_provider == "initscripts"): False 19110 1726882581.09743: when evaluation is False, skipping this task 19110 1726882581.09751: _execute() done 19110 1726882581.09765: dumping result to json 
19110 1726882581.09774: done dumping result, returning 19110 1726882581.09787: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0e448fcc-3ce9-5372-c19a-00000000006b] 19110 1726882581.09799: sending task result for task 0e448fcc-3ce9-5372-c19a-00000000006b skipping: [managed_node1] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 19110 1726882581.09952: no more pending results, returning what we have 19110 1726882581.09956: results queue empty 19110 1726882581.09958: checking for any_errors_fatal 19110 1726882581.09965: done checking for any_errors_fatal 19110 1726882581.09966: checking for max_fail_percentage 19110 1726882581.09968: done checking for max_fail_percentage 19110 1726882581.09969: checking to see if all hosts have failed and the running result is not ok 19110 1726882581.09970: done checking to see if all hosts have failed 19110 1726882581.09971: getting the remaining hosts for this loop 19110 1726882581.09972: done getting the remaining hosts for this loop 19110 1726882581.09977: getting the next task for host managed_node1 19110 1726882581.09984: done getting next task for host managed_node1 19110 1726882581.09987: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 19110 1726882581.09990: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
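The `skipping: [managed_node1]` result above is produced by a conditional task: the log shows two `when` evaluations, `ansible_distribution_major_version != '6'` (True) and `network_provider == "initscripts"` (False). A minimal sketch of what such a task looks like follows; the task name, path, and conditions are taken from the log, but the task body (module and arguments) is an assumption, not the actual content of `roles/network/tasks/main.yml:150`:

```yaml
# Hypothetical reconstruction of the skipped task at roles/network/tasks/main.yml:150.
# The log shows it is gated on the initscripts provider, so with the provider set
# to NetworkManager the second `when` clause is False and the task is skipped.
- name: Ensure initscripts network file dependency is present
  ansible.builtin.copy:
    dest: /etc/sysconfig/network        # assumed destination; not shown in the log
    content: "# Created by the network system role"
  when:
    - ansible_distribution_major_version != '6'
    - network_provider == "initscripts"
```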
False 19110 1726882581.10005: getting variables 19110 1726882581.10007: in VariableManager get_vars() 19110 1726882581.10045: Calling all_inventory to load vars for managed_node1 19110 1726882581.10048: Calling groups_inventory to load vars for managed_node1 19110 1726882581.10051: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882581.10063: Calling all_plugins_play to load vars for managed_node1 19110 1726882581.10077: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882581.10082: Calling groups_plugins_play to load vars for managed_node1 19110 1726882581.11160: done sending task result for task 0e448fcc-3ce9-5372-c19a-00000000006b 19110 1726882581.11165: WORKER PROCESS EXITING 19110 1726882581.12846: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882581.14794: done with get_vars() 19110 1726882581.14823: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 21:36:21 -0400 (0:00:00.070) 0:00:38.006 ****** 19110 1726882581.14923: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 19110 1726882581.15242: worker is 1 (out of 1 available) 19110 1726882581.15257: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 19110 1726882581.15273: done queuing things up, now waiting for results queue to drain 19110 1726882581.15275: waiting for pending results... 
19110 1726882581.15572: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 19110 1726882581.15692: in run() - task 0e448fcc-3ce9-5372-c19a-00000000006c 19110 1726882581.15720: variable 'ansible_search_path' from source: unknown 19110 1726882581.15731: variable 'ansible_search_path' from source: unknown 19110 1726882581.15780: calling self._execute() 19110 1726882581.15903: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882581.15916: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882581.15945: variable 'omit' from source: magic vars 19110 1726882581.16432: variable 'ansible_distribution_major_version' from source: facts 19110 1726882581.16460: Evaluated conditional (ansible_distribution_major_version != '6'): True 19110 1726882581.16480: variable 'omit' from source: magic vars 19110 1726882581.16543: variable 'omit' from source: magic vars 19110 1726882581.16779: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 19110 1726882581.19717: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 19110 1726882581.19802: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 19110 1726882581.19849: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 19110 1726882581.19906: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 19110 1726882581.19941: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 19110 1726882581.20054: variable 'network_provider' from source: set_fact 19110 1726882581.20233: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19110 1726882581.20275: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19110 1726882581.20328: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19110 1726882581.20386: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19110 1726882581.20416: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19110 1726882581.20521: variable 'omit' from source: magic vars 19110 1726882581.20666: variable 'omit' from source: magic vars 19110 1726882581.20785: variable 'network_connections' from source: play vars 19110 1726882581.20804: variable 'profile' from source: play vars 19110 1726882581.20892: variable 'profile' from source: play vars 19110 1726882581.20902: variable 'interface' from source: set_fact 19110 1726882581.20967: variable 'interface' from source: set_fact 19110 1726882581.21132: variable 'omit' from source: magic vars 19110 1726882581.21146: variable '__lsr_ansible_managed' from source: task vars 19110 1726882581.21223: variable '__lsr_ansible_managed' from source: task vars 19110 1726882581.21546: Loaded config def from plugin (lookup/template) 19110 1726882581.21558: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 19110 1726882581.21599: File lookup term: get_ansible_managed.j2 19110 
1726882581.21607: variable 'ansible_search_path' from source: unknown 19110 1726882581.21617: evaluation_path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 19110 1726882581.21642: search_path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 19110 1726882581.21669: variable 'ansible_search_path' from source: unknown 19110 1726882581.28773: variable 'ansible_managed' from source: unknown 19110 1726882581.28948: variable 'omit' from source: magic vars 19110 1726882581.28988: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19110 1726882581.29031: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19110 1726882581.29055: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19110 1726882581.29090: Loading ShellModule 'sh' from 
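The `File lookup term: get_ansible_managed.j2` and `search_path:` lines above trace Ansible's `template` lookup walking the role's `templates/` directory, the role root, `tasks/templates/`, and the test playbook directories, in that order. A sketch of the kind of call site that triggers this search (the variable name `__lsr_ansible_managed` appears in the log; the exact task wording is an assumption):

```yaml
# Hypothetical call site for the template lookup traced in the log.
# Ansible resolves 'get_ansible_managed.j2' against each path in the
# search_path line until it finds the file.
- name: Render the ansible_managed header for generated files
  ansible.builtin.set_fact:
    __lsr_ansible_managed: "{{ lookup('template', 'get_ansible_managed.j2') }}"
```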
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19110 1726882581.29107: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19110 1726882581.29150: variable 'inventory_hostname' from source: host vars for 'managed_node1' 19110 1726882581.29159: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882581.29170: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882581.29280: Set connection var ansible_timeout to 10 19110 1726882581.29299: Set connection var ansible_module_compression to ZIP_DEFLATED 19110 1726882581.29309: Set connection var ansible_shell_executable to /bin/sh 19110 1726882581.29317: Set connection var ansible_shell_type to sh 19110 1726882581.29323: Set connection var ansible_connection to ssh 19110 1726882581.29337: Set connection var ansible_pipelining to False 19110 1726882581.29368: variable 'ansible_shell_executable' from source: unknown 19110 1726882581.29377: variable 'ansible_connection' from source: unknown 19110 1726882581.29384: variable 'ansible_module_compression' from source: unknown 19110 1726882581.29391: variable 'ansible_shell_type' from source: unknown 19110 1726882581.29398: variable 'ansible_shell_executable' from source: unknown 19110 1726882581.29405: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882581.29413: variable 'ansible_pipelining' from source: unknown 19110 1726882581.29420: variable 'ansible_timeout' from source: unknown 19110 1726882581.29432: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882581.29585: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
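The `Set connection var ...` lines above reflect per-host connection settings resolved from configuration and host vars. A sketch of an inventory fragment that would yield these values (the values are taken from the log; their placement in `inventory.yml` host vars, rather than `ansible.cfg` defaults, is an assumption):

```yaml
# Hypothetical inventory host-vars snippet matching the connection vars in the log.
managed_node1:
  ansible_host: 10.31.44.90
  ansible_connection: ssh
  ansible_timeout: 10
  ansible_pipelining: false
  ansible_shell_type: sh
  ansible_shell_executable: /bin/sh
```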
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 19110 1726882581.29609: variable 'omit' from source: magic vars 19110 1726882581.29620: starting attempt loop 19110 1726882581.29627: running the handler 19110 1726882581.29644: _low_level_execute_command(): starting 19110 1726882581.29659: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 19110 1726882581.30538: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882581.30562: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882581.30580: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882581.30599: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882581.30653: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882581.30676: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882581.30692: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882581.30711: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882581.30723: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882581.30735: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882581.30751: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882581.30779: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882581.30797: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882581.30810: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 
10.31.44.90 originally 10.31.44.90 <<< 19110 1726882581.30822: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882581.30835: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882581.30935: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882581.30953: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882581.30975: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882581.31226: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882581.32795: stdout chunk (state=3): >>>/root <<< 19110 1726882581.32968: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882581.32971: stdout chunk (state=3): >>><<< 19110 1726882581.32989: stderr chunk (state=3): >>><<< 19110 1726882581.33089: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 1726882581.33092: _low_level_execute_command(): starting 19110 1726882581.33095: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882581.33006-20807-133346428478035 `" && echo ansible-tmp-1726882581.33006-20807-133346428478035="` echo /root/.ansible/tmp/ansible-tmp-1726882581.33006-20807-133346428478035 `" ) && sleep 0' 19110 1726882581.33676: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882581.33690: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882581.33703: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882581.33720: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882581.33783: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882581.33800: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882581.33820: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882581.33847: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882581.33870: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882581.33884: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882581.33897: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882581.33913: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882581.33933: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882581.33947: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882581.33961: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882581.33981: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882581.34054: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882581.34077: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882581.34095: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882581.34235: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882581.36090: stdout chunk (state=3): >>>ansible-tmp-1726882581.33006-20807-133346428478035=/root/.ansible/tmp/ansible-tmp-1726882581.33006-20807-133346428478035 <<< 19110 1726882581.36202: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882581.36273: stderr chunk (state=3): >>><<< 19110 1726882581.36279: stdout chunk (state=3): >>><<< 19110 1726882581.36301: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882581.33006-20807-133346428478035=/root/.ansible/tmp/ansible-tmp-1726882581.33006-20807-133346428478035 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 1726882581.36346: variable 'ansible_module_compression' from source: unknown 19110 1726882581.36393: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-191108pnkimox/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 19110 1726882581.36425: variable 'ansible_facts' from source: unknown 19110 1726882581.36722: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882581.33006-20807-133346428478035/AnsiballZ_network_connections.py 19110 1726882581.36865: Sending initial data 19110 1726882581.36870: Sent initial data (166 bytes) 19110 1726882581.40014: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882581.40127: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882581.40137: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882581.40159: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882581.40203: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882581.40225: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882581.40236: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882581.40251: stderr chunk (state=3): >>>debug1: 
configuration requests final Match pass <<< 19110 1726882581.40279: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882581.40286: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882581.40294: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882581.40303: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882581.40315: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882581.40376: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882581.40384: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882581.40394: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882581.40465: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882581.40591: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882581.40595: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882581.40809: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882581.42559: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 19110 1726882581.42649: stderr 
chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 19110 1726882581.42745: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-191108pnkimox/tmpcnxik_h4 /root/.ansible/tmp/ansible-tmp-1726882581.33006-20807-133346428478035/AnsiballZ_network_connections.py <<< 19110 1726882581.42837: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 19110 1726882581.45169: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882581.45233: stderr chunk (state=3): >>><<< 19110 1726882581.45237: stdout chunk (state=3): >>><<< 19110 1726882581.45260: done transferring module to remote 19110 1726882581.45270: _low_level_execute_command(): starting 19110 1726882581.45272: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882581.33006-20807-133346428478035/ /root/.ansible/tmp/ansible-tmp-1726882581.33006-20807-133346428478035/AnsiballZ_network_connections.py && sleep 0' 19110 1726882581.47917: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882581.47925: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882581.47935: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882581.47949: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882581.47993: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882581.47996: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882581.48005: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882581.48019: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 
1726882581.48026: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882581.48032: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882581.48040: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882581.48049: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882581.48060: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882581.48069: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882581.48075: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882581.48085: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882581.48168: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882581.48175: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882581.48185: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882581.48468: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882581.50233: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882581.50236: stdout chunk (state=3): >>><<< 19110 1726882581.50245: stderr chunk (state=3): >>><<< 19110 1726882581.50265: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 1726882581.50276: _low_level_execute_command(): starting 19110 1726882581.50279: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882581.33006-20807-133346428478035/AnsiballZ_network_connections.py && sleep 0' 19110 1726882581.52125: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882581.52207: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882581.52216: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882581.52229: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882581.52286: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882581.52437: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882581.52446: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882581.52463: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882581.52496: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address 
<<< 19110 1726882581.52502: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882581.52510: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882581.52519: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882581.52530: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882581.52540: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882581.52543: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882581.52553: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882581.52708: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882581.52769: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882581.52787: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882581.52948: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882581.76532: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_lbw_7d3h/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back <<< 19110 1726882581.76537: stdout chunk (state=3): >>> File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_lbw_7d3h/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: 
Connection volatilize aborted on lsr27/264de621-b20e-42af-8432-5f491fad83e4: error=unknown <<< 19110 1726882581.76684: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr27", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr27", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 19110 1726882581.78483: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. <<< 19110 1726882581.78487: stdout chunk (state=3): >>><<< 19110 1726882581.78494: stderr chunk (state=3): >>><<< 19110 1726882581.78513: _low_level_execute_command() done: rc=0, stdout=Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_lbw_7d3h/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_lbw_7d3h/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on lsr27/264de621-b20e-42af-8432-5f491fad83e4: error=unknown {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr27", "persistent_state": "absent"}], 
"__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr27", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
19110 1726882581.78551: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'lsr27', 'persistent_state': 'absent'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882581.33006-20807-133346428478035/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 19110 1726882581.78568: _low_level_execute_command(): starting 19110 1726882581.78573: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882581.33006-20807-133346428478035/ > /dev/null 2>&1 && sleep 0' 19110 1726882581.80223: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882581.80244: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882581.80260: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882581.80286: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882581.80361: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882581.80414: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882581.80432: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 
1726882581.80454: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882581.80468: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882581.80479: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882581.80490: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882581.80502: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882581.80524: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882581.80539: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882581.80553: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882581.80569: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882581.80700: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882581.80765: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882581.80785: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882581.80957: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882581.82858: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882581.82862: stdout chunk (state=3): >>><<< 19110 1726882581.82866: stderr chunk (state=3): >>><<< 19110 1726882581.83077: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 1726882581.83080: handler run complete 19110 1726882581.83083: attempt loop complete, returning result 19110 1726882581.83085: _execute() done 19110 1726882581.83087: dumping result to json 19110 1726882581.83089: done dumping result, returning 19110 1726882581.83091: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0e448fcc-3ce9-5372-c19a-00000000006c] 19110 1726882581.83094: sending task result for task 0e448fcc-3ce9-5372-c19a-00000000006c 19110 1726882581.83172: done sending task result for task 0e448fcc-3ce9-5372-c19a-00000000006c 19110 1726882581.83177: WORKER PROCESS EXITING changed: [managed_node1] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "lsr27", "persistent_state": "absent" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 19110 1726882581.83281: no more pending results, returning what we have 19110 1726882581.83285: results queue empty 19110 1726882581.83288: checking for any_errors_fatal 19110 
1726882581.83293: done checking for any_errors_fatal 19110 1726882581.83294: checking for max_fail_percentage 19110 1726882581.83296: done checking for max_fail_percentage 19110 1726882581.83297: checking to see if all hosts have failed and the running result is not ok 19110 1726882581.83298: done checking to see if all hosts have failed 19110 1726882581.83299: getting the remaining hosts for this loop 19110 1726882581.83300: done getting the remaining hosts for this loop 19110 1726882581.83304: getting the next task for host managed_node1 19110 1726882581.83310: done getting next task for host managed_node1 19110 1726882581.83313: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 19110 1726882581.83315: ^ state is: HOST STATE: block=2, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19110 1726882581.83325: getting variables 19110 1726882581.83327: in VariableManager get_vars() 19110 1726882581.83366: Calling all_inventory to load vars for managed_node1 19110 1726882581.83370: Calling groups_inventory to load vars for managed_node1 19110 1726882581.83373: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882581.83385: Calling all_plugins_play to load vars for managed_node1 19110 1726882581.83389: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882581.83392: Calling groups_plugins_play to load vars for managed_node1 19110 1726882581.86371: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882581.88431: done with get_vars() 19110 1726882581.88453: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 21:36:21 -0400 (0:00:00.736) 0:00:38.742 ****** 19110 1726882581.88548: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_state 19110 1726882581.88887: worker is 1 (out of 1 available) 19110 1726882581.88901: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_state 19110 1726882581.88916: done queuing things up, now waiting for results queue to drain 19110 1726882581.88918: waiting for pending results... 
19110 1726882581.89215: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state 19110 1726882581.89347: in run() - task 0e448fcc-3ce9-5372-c19a-00000000006d 19110 1726882581.89383: variable 'ansible_search_path' from source: unknown 19110 1726882581.89392: variable 'ansible_search_path' from source: unknown 19110 1726882581.89433: calling self._execute() 19110 1726882581.89538: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882581.89549: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882581.89569: variable 'omit' from source: magic vars 19110 1726882581.89977: variable 'ansible_distribution_major_version' from source: facts 19110 1726882581.89995: Evaluated conditional (ansible_distribution_major_version != '6'): True 19110 1726882581.90142: variable 'network_state' from source: role '' defaults 19110 1726882581.90158: Evaluated conditional (network_state != {}): False 19110 1726882581.90170: when evaluation is False, skipping this task 19110 1726882581.90177: _execute() done 19110 1726882581.90184: dumping result to json 19110 1726882581.90191: done dumping result, returning 19110 1726882581.90200: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state [0e448fcc-3ce9-5372-c19a-00000000006d] 19110 1726882581.90211: sending task result for task 0e448fcc-3ce9-5372-c19a-00000000006d 19110 1726882581.90330: done sending task result for task 0e448fcc-3ce9-5372-c19a-00000000006d skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 19110 1726882581.90395: no more pending results, returning what we have 19110 1726882581.90399: results queue empty 19110 1726882581.90400: checking for any_errors_fatal 19110 1726882581.90411: done checking for any_errors_fatal 19110 1726882581.90411: checking for 
max_fail_percentage 19110 1726882581.90413: done checking for max_fail_percentage 19110 1726882581.90414: checking to see if all hosts have failed and the running result is not ok 19110 1726882581.90415: done checking to see if all hosts have failed 19110 1726882581.90416: getting the remaining hosts for this loop 19110 1726882581.90417: done getting the remaining hosts for this loop 19110 1726882581.90421: getting the next task for host managed_node1 19110 1726882581.90427: done getting next task for host managed_node1 19110 1726882581.90431: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 19110 1726882581.90434: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19110 1726882581.90453: getting variables 19110 1726882581.90455: in VariableManager get_vars() 19110 1726882581.90495: Calling all_inventory to load vars for managed_node1 19110 1726882581.90499: Calling groups_inventory to load vars for managed_node1 19110 1726882581.90501: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882581.90514: Calling all_plugins_play to load vars for managed_node1 19110 1726882581.90518: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882581.90521: Calling groups_plugins_play to load vars for managed_node1 19110 1726882581.91558: WORKER PROCESS EXITING 19110 1726882581.92500: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882581.95653: done with get_vars() 19110 1726882581.95681: done getting variables 19110 1726882581.95750: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 21:36:21 -0400 (0:00:00.072) 0:00:38.815 ****** 19110 1726882581.95783: entering _queue_task() for managed_node1/debug 19110 1726882581.96103: worker is 1 (out of 1 available) 19110 1726882581.96117: exiting _queue_task() for managed_node1/debug 19110 1726882581.96128: done queuing things up, now waiting for results queue to drain 19110 1726882581.96130: waiting for pending results... 19110 1726882581.96415: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 19110 1726882581.96531: in run() - task 0e448fcc-3ce9-5372-c19a-00000000006e 19110 1726882581.96552: variable 'ansible_search_path' from source: unknown 19110 1726882581.96559: variable 'ansible_search_path' from source: unknown 19110 1726882581.96611: calling self._execute() 19110 1726882581.96713: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882581.96724: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882581.96737: variable 'omit' from source: magic vars 19110 1726882581.97133: variable 'ansible_distribution_major_version' from source: facts 19110 1726882581.97153: Evaluated conditional (ansible_distribution_major_version != '6'): True 19110 1726882581.97165: variable 'omit' from source: magic vars 19110 1726882581.97202: variable 'omit' from source: magic vars 19110 1726882581.97249: variable 'omit' from source: magic vars 19110 1726882581.97294: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19110 1726882581.97336: Loading 
Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19110 1726882581.97368: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19110 1726882581.97389: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19110 1726882581.97403: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19110 1726882581.97434: variable 'inventory_hostname' from source: host vars for 'managed_node1' 19110 1726882581.97444: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882581.97455: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882581.97573: Set connection var ansible_timeout to 10 19110 1726882581.97592: Set connection var ansible_module_compression to ZIP_DEFLATED 19110 1726882581.97601: Set connection var ansible_shell_executable to /bin/sh 19110 1726882581.97608: Set connection var ansible_shell_type to sh 19110 1726882581.97614: Set connection var ansible_connection to ssh 19110 1726882581.97623: Set connection var ansible_pipelining to False 19110 1726882581.97646: variable 'ansible_shell_executable' from source: unknown 19110 1726882581.97674: variable 'ansible_connection' from source: unknown 19110 1726882581.97686: variable 'ansible_module_compression' from source: unknown 19110 1726882581.97692: variable 'ansible_shell_type' from source: unknown 19110 1726882581.97697: variable 'ansible_shell_executable' from source: unknown 19110 1726882581.97703: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882581.97709: variable 'ansible_pipelining' from source: unknown 19110 1726882581.97714: variable 'ansible_timeout' from source: unknown 19110 1726882581.97720: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node1' 19110 1726882581.97858: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 19110 1726882581.97888: variable 'omit' from source: magic vars 19110 1726882581.97908: starting attempt loop 19110 1726882581.97922: running the handler 19110 1726882581.98260: variable '__network_connections_result' from source: set_fact 19110 1726882581.98321: handler run complete 19110 1726882581.98349: attempt loop complete, returning result 19110 1726882581.98361: _execute() done 19110 1726882581.98372: dumping result to json 19110 1726882581.98388: done dumping result, returning 19110 1726882581.98403: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0e448fcc-3ce9-5372-c19a-00000000006e] 19110 1726882581.98414: sending task result for task 0e448fcc-3ce9-5372-c19a-00000000006e ok: [managed_node1] => { "__network_connections_result.stderr_lines": [ "" ] } 19110 1726882581.98573: no more pending results, returning what we have 19110 1726882581.98579: results queue empty 19110 1726882581.98580: checking for any_errors_fatal 19110 1726882581.98586: done checking for any_errors_fatal 19110 1726882581.98587: checking for max_fail_percentage 19110 1726882581.98589: done checking for max_fail_percentage 19110 1726882581.98590: checking to see if all hosts have failed and the running result is not ok 19110 1726882581.98591: done checking to see if all hosts have failed 19110 1726882581.98592: getting the remaining hosts for this loop 19110 1726882581.98593: done getting the remaining hosts for this loop 19110 1726882581.98597: getting the next task for host managed_node1 19110 1726882581.98604: done getting next task for host managed_node1 19110 
1726882581.98608: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 19110 1726882581.98610: ^ state is: HOST STATE: block=2, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19110 1726882581.98620: getting variables 19110 1726882581.98622: in VariableManager get_vars() 19110 1726882581.98671: Calling all_inventory to load vars for managed_node1 19110 1726882581.98675: Calling groups_inventory to load vars for managed_node1 19110 1726882581.98678: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882581.98690: Calling all_plugins_play to load vars for managed_node1 19110 1726882581.98693: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882581.98696: Calling groups_plugins_play to load vars for managed_node1 19110 1726882581.99730: done sending task result for task 0e448fcc-3ce9-5372-c19a-00000000006e 19110 1726882581.99733: WORKER PROCESS EXITING 19110 1726882582.00574: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882582.01762: done with get_vars() 19110 1726882582.01779: done getting variables 19110 1726882582.01822: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 21:36:22 -0400 (0:00:00.060) 0:00:38.875 ****** 
19110 1726882582.01845: entering _queue_task() for managed_node1/debug 19110 1726882582.02064: worker is 1 (out of 1 available) 19110 1726882582.02079: exiting _queue_task() for managed_node1/debug 19110 1726882582.02093: done queuing things up, now waiting for results queue to drain 19110 1726882582.02094: waiting for pending results... 19110 1726882582.02276: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 19110 1726882582.02347: in run() - task 0e448fcc-3ce9-5372-c19a-00000000006f 19110 1726882582.02364: variable 'ansible_search_path' from source: unknown 19110 1726882582.02369: variable 'ansible_search_path' from source: unknown 19110 1726882582.02396: calling self._execute() 19110 1726882582.02493: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882582.02499: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882582.02501: variable 'omit' from source: magic vars 19110 1726882582.02957: variable 'ansible_distribution_major_version' from source: facts 19110 1726882582.02980: Evaluated conditional (ansible_distribution_major_version != '6'): True 19110 1726882582.02984: variable 'omit' from source: magic vars 19110 1726882582.03073: variable 'omit' from source: magic vars 19110 1726882582.03120: variable 'omit' from source: magic vars 19110 1726882582.03211: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19110 1726882582.03303: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19110 1726882582.03306: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19110 1726882582.03441: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19110 1726882582.03456: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19110 1726882582.03488: variable 'inventory_hostname' from source: host vars for 'managed_node1' 19110 1726882582.03496: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882582.03616: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882582.03917: Set connection var ansible_timeout to 10 19110 1726882582.03939: Set connection var ansible_module_compression to ZIP_DEFLATED 19110 1726882582.03953: Set connection var ansible_shell_executable to /bin/sh 19110 1726882582.03960: Set connection var ansible_shell_type to sh 19110 1726882582.03970: Set connection var ansible_connection to ssh 19110 1726882582.03980: Set connection var ansible_pipelining to False 19110 1726882582.04013: variable 'ansible_shell_executable' from source: unknown 19110 1726882582.04021: variable 'ansible_connection' from source: unknown 19110 1726882582.04027: variable 'ansible_module_compression' from source: unknown 19110 1726882582.04033: variable 'ansible_shell_type' from source: unknown 19110 1726882582.04043: variable 'ansible_shell_executable' from source: unknown 19110 1726882582.04055: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882582.04064: variable 'ansible_pipelining' from source: unknown 19110 1726882582.04071: variable 'ansible_timeout' from source: unknown 19110 1726882582.04082: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882582.04682: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 19110 1726882582.04705: variable 'omit' from source: magic vars 19110 1726882582.04708: starting attempt 
loop 19110 1726882582.04725: running the handler 19110 1726882582.05036: variable '__network_connections_result' from source: set_fact 19110 1726882582.05039: variable '__network_connections_result' from source: set_fact 19110 1726882582.05042: handler run complete 19110 1726882582.05044: attempt loop complete, returning result 19110 1726882582.05046: _execute() done 19110 1726882582.05048: dumping result to json 19110 1726882582.05050: done dumping result, returning 19110 1726882582.05052: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0e448fcc-3ce9-5372-c19a-00000000006f] 19110 1726882582.05054: sending task result for task 0e448fcc-3ce9-5372-c19a-00000000006f 19110 1726882582.05230: done sending task result for task 0e448fcc-3ce9-5372-c19a-00000000006f 19110 1726882582.05233: WORKER PROCESS EXITING ok: [managed_node1] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "lsr27", "persistent_state": "absent" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 19110 1726882582.05618: no more pending results, returning what we have 19110 1726882582.05621: results queue empty 19110 1726882582.05622: checking for any_errors_fatal 19110 1726882582.05627: done checking for any_errors_fatal 19110 1726882582.05628: checking for max_fail_percentage 19110 1726882582.05630: done checking for max_fail_percentage 19110 1726882582.05631: checking to see if all hosts have failed and the running result is not ok 19110 1726882582.05632: done checking to see if all hosts have failed 19110 1726882582.05632: getting the remaining hosts for this loop 19110 1726882582.05653: done getting the remaining hosts for this loop 19110 1726882582.05704: getting the next 
task for host managed_node1 19110 1726882582.05757: done getting next task for host managed_node1 19110 1726882582.05761: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 19110 1726882582.05765: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19110 1726882582.05801: getting variables 19110 1726882582.05803: in VariableManager get_vars() 19110 1726882582.05836: Calling all_inventory to load vars for managed_node1 19110 1726882582.05839: Calling groups_inventory to load vars for managed_node1 19110 1726882582.05841: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882582.05850: Calling all_plugins_play to load vars for managed_node1 19110 1726882582.05860: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882582.05867: Calling groups_plugins_play to load vars for managed_node1 19110 1726882582.07773: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882582.08980: done with get_vars() 19110 1726882582.09002: done getting variables 19110 1726882582.09074: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 21:36:22 -0400 (0:00:00.072) 0:00:38.948 ****** 19110 1726882582.09111: entering _queue_task() for 
managed_node1/debug 19110 1726882582.09380: worker is 1 (out of 1 available) 19110 1726882582.09396: exiting _queue_task() for managed_node1/debug 19110 1726882582.09411: done queuing things up, now waiting for results queue to drain 19110 1726882582.09413: waiting for pending results... 19110 1726882582.09783: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 19110 1726882582.09861: in run() - task 0e448fcc-3ce9-5372-c19a-000000000070 19110 1726882582.09874: variable 'ansible_search_path' from source: unknown 19110 1726882582.09878: variable 'ansible_search_path' from source: unknown 19110 1726882582.09915: calling self._execute() 19110 1726882582.09989: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882582.09993: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882582.10001: variable 'omit' from source: magic vars 19110 1726882582.10292: variable 'ansible_distribution_major_version' from source: facts 19110 1726882582.10302: Evaluated conditional (ansible_distribution_major_version != '6'): True 19110 1726882582.10392: variable 'network_state' from source: role '' defaults 19110 1726882582.10398: Evaluated conditional (network_state != {}): False 19110 1726882582.10401: when evaluation is False, skipping this task 19110 1726882582.10404: _execute() done 19110 1726882582.10407: dumping result to json 19110 1726882582.10409: done dumping result, returning 19110 1726882582.10416: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0e448fcc-3ce9-5372-c19a-000000000070] 19110 1726882582.10423: sending task result for task 0e448fcc-3ce9-5372-c19a-000000000070 19110 1726882582.10505: done sending task result for task 0e448fcc-3ce9-5372-c19a-000000000070 19110 1726882582.10509: WORKER PROCESS EXITING skipping: [managed_node1] => { "false_condition": 
"network_state != {}" } 19110 1726882582.10584: no more pending results, returning what we have 19110 1726882582.10588: results queue empty 19110 1726882582.10589: checking for any_errors_fatal 19110 1726882582.10594: done checking for any_errors_fatal 19110 1726882582.10595: checking for max_fail_percentage 19110 1726882582.10597: done checking for max_fail_percentage 19110 1726882582.10598: checking to see if all hosts have failed and the running result is not ok 19110 1726882582.10598: done checking to see if all hosts have failed 19110 1726882582.10599: getting the remaining hosts for this loop 19110 1726882582.10600: done getting the remaining hosts for this loop 19110 1726882582.10603: getting the next task for host managed_node1 19110 1726882582.10607: done getting next task for host managed_node1 19110 1726882582.10610: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 19110 1726882582.10613: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19110 1726882582.10632: getting variables 19110 1726882582.10633: in VariableManager get_vars() 19110 1726882582.10658: Calling all_inventory to load vars for managed_node1 19110 1726882582.10660: Calling groups_inventory to load vars for managed_node1 19110 1726882582.10662: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882582.10670: Calling all_plugins_play to load vars for managed_node1 19110 1726882582.10672: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882582.10674: Calling groups_plugins_play to load vars for managed_node1 19110 1726882582.11459: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882582.12967: done with get_vars() 19110 1726882582.12982: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 21:36:22 -0400 (0:00:00.039) 0:00:38.987 ****** 19110 1726882582.13044: entering _queue_task() for managed_node1/ping 19110 1726882582.13244: worker is 1 (out of 1 available) 19110 1726882582.13261: exiting _queue_task() for managed_node1/ping 19110 1726882582.13274: done queuing things up, now waiting for results queue to drain 19110 1726882582.13276: waiting for pending results... 
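The skip recorded above comes from the task's `when:` conditional: with the role default `network_state: {}`, the expression `network_state != {}` evaluates to False and the debug task is skipped, exactly as the `false_condition` in the result shows. A minimal sketch of that evaluation in plain Python (using `eval()` as a stand-in for Ansible's Jinja2 templating; this is an illustration, not the role's or ansible-core's actual code):

```python
def evaluate_when(conditional: str, variables: dict) -> bool:
    # Ansible renders the `when:` expression against the host's variables
    # and casts the result to a boolean; plain eval() stands in for Jinja2 here.
    return bool(eval(conditional, {"__builtins__": {}}, dict(variables)))

# Role default: network_state is an empty dict, so the task is skipped.
print(evaluate_when("network_state != {}", {"network_state": {}}))  # False
# The distribution check from the same log entries evaluates True:
print(evaluate_when("ansible_distribution_major_version != '6'",
                    {"ansible_distribution_major_version": "9"}))   # True
```

When the conditional is False, TaskExecutor short-circuits before `_execute_module()`, which is why no connection activity appears between the two evaluations in the log.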
19110 1726882582.13569: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity 19110 1726882582.13682: in run() - task 0e448fcc-3ce9-5372-c19a-000000000071 19110 1726882582.13708: variable 'ansible_search_path' from source: unknown 19110 1726882582.13712: variable 'ansible_search_path' from source: unknown 19110 1726882582.13753: calling self._execute() 19110 1726882582.13834: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882582.13845: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882582.13849: variable 'omit' from source: magic vars 19110 1726882582.14232: variable 'ansible_distribution_major_version' from source: facts 19110 1726882582.14242: Evaluated conditional (ansible_distribution_major_version != '6'): True 19110 1726882582.14248: variable 'omit' from source: magic vars 19110 1726882582.14291: variable 'omit' from source: magic vars 19110 1726882582.14336: variable 'omit' from source: magic vars 19110 1726882582.14382: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19110 1726882582.14410: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19110 1726882582.14426: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19110 1726882582.14453: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19110 1726882582.14461: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19110 1726882582.14492: variable 'inventory_hostname' from source: host vars for 'managed_node1' 19110 1726882582.14502: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882582.14504: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node1' 19110 1726882582.14569: Set connection var ansible_timeout to 10 19110 1726882582.14579: Set connection var ansible_module_compression to ZIP_DEFLATED 19110 1726882582.14584: Set connection var ansible_shell_executable to /bin/sh 19110 1726882582.14586: Set connection var ansible_shell_type to sh 19110 1726882582.14589: Set connection var ansible_connection to ssh 19110 1726882582.14594: Set connection var ansible_pipelining to False 19110 1726882582.14612: variable 'ansible_shell_executable' from source: unknown 19110 1726882582.14615: variable 'ansible_connection' from source: unknown 19110 1726882582.14621: variable 'ansible_module_compression' from source: unknown 19110 1726882582.14624: variable 'ansible_shell_type' from source: unknown 19110 1726882582.14627: variable 'ansible_shell_executable' from source: unknown 19110 1726882582.14629: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882582.14631: variable 'ansible_pipelining' from source: unknown 19110 1726882582.14633: variable 'ansible_timeout' from source: unknown 19110 1726882582.14635: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882582.14826: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 19110 1726882582.14874: variable 'omit' from source: magic vars 19110 1726882582.14879: starting attempt loop 19110 1726882582.14882: running the handler 19110 1726882582.14884: _low_level_execute_command(): starting 19110 1726882582.14886: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 19110 1726882582.15573: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882582.15576: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 
1726882582.15586: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882582.15612: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882582.15671: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882582.15687: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882582.15693: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882582.15722: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882582.15735: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882582.15738: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882582.15747: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882582.15773: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882582.15777: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882582.15795: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882582.15798: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882582.15803: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882582.15868: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882582.15894: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882582.16012: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882582.17649: stdout chunk (state=3): >>>/root <<< 19110 
1726882582.17755: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882582.17859: stderr chunk (state=3): >>><<< 19110 1726882582.17863: stdout chunk (state=3): >>><<< 19110 1726882582.17868: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 1726882582.17871: _low_level_execute_command(): starting 19110 1726882582.17874: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882582.1782417-20851-111329668827095 `" && echo ansible-tmp-1726882582.1782417-20851-111329668827095="` echo /root/.ansible/tmp/ansible-tmp-1726882582.1782417-20851-111329668827095 `" ) && sleep 0' 19110 1726882582.18433: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882582.18456: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882582.18484: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882582.18538: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882582.18554: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882582.18690: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882582.20530: stdout chunk (state=3): >>>ansible-tmp-1726882582.1782417-20851-111329668827095=/root/.ansible/tmp/ansible-tmp-1726882582.1782417-20851-111329668827095 <<< 19110 1726882582.20655: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882582.20741: stderr chunk (state=3): >>><<< 19110 1726882582.20747: stdout chunk (state=3): >>><<< 19110 1726882582.20779: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882582.1782417-20851-111329668827095=/root/.ansible/tmp/ansible-tmp-1726882582.1782417-20851-111329668827095 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 1726882582.20804: variable 'ansible_module_compression' from source: unknown 19110 1726882582.20858: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-191108pnkimox/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 19110 1726882582.20879: variable 'ansible_facts' from source: unknown 19110 1726882582.20926: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882582.1782417-20851-111329668827095/AnsiballZ_ping.py 19110 1726882582.21085: Sending initial data 19110 1726882582.21088: Sent initial data (153 bytes) 19110 1726882582.21803: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882582.21806: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882582.21848: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 
10.31.44.90 originally 10.31.44.90 debug2: match not found <<< 19110 1726882582.21852: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882582.21857: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882582.21931: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882582.21934: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882582.22048: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882582.23740: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 <<< 19110 1726882582.23742: stderr chunk (state=3): >>>debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 19110 1726882582.23835: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 19110 1726882582.23924: stdout 
chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-191108pnkimox/tmpee1teykb /root/.ansible/tmp/ansible-tmp-1726882582.1782417-20851-111329668827095/AnsiballZ_ping.py <<< 19110 1726882582.24018: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 19110 1726882582.25079: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882582.25170: stderr chunk (state=3): >>><<< 19110 1726882582.25173: stdout chunk (state=3): >>><<< 19110 1726882582.25188: done transferring module to remote 19110 1726882582.25199: _low_level_execute_command(): starting 19110 1726882582.25202: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882582.1782417-20851-111329668827095/ /root/.ansible/tmp/ansible-tmp-1726882582.1782417-20851-111329668827095/AnsiballZ_ping.py && sleep 0' 19110 1726882582.26190: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882582.26227: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882582.26256: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882582.26293: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882582.26381: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882582.26407: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882582.26437: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882582.26478: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882582.26501: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882582.26518: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 
1726882582.26543: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882582.26576: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882582.26602: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882582.26620: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882582.26640: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882582.26662: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882582.26755: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882582.26759: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882582.26860: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882582.28579: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882582.28626: stderr chunk (state=3): >>><<< 19110 1726882582.28629: stdout chunk (state=3): >>><<< 19110 1726882582.28643: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 1726882582.28646: _low_level_execute_command(): starting 19110 1726882582.28651: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882582.1782417-20851-111329668827095/AnsiballZ_ping.py && sleep 0' 19110 1726882582.29253: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882582.29267: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882582.29280: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882582.29293: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882582.29328: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882582.29334: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882582.29343: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882582.29356: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882582.29367: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882582.29375: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882582.29381: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882582.29390: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config <<< 19110 1726882582.29401: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882582.29408: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882582.29414: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882582.29424: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882582.29497: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882582.29513: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882582.29521: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882582.29692: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882582.42537: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 19110 1726882582.43601: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
<<< 19110 1726882582.43605: stdout chunk (state=3): >>><<< 19110 1726882582.43608: stderr chunk (state=3): >>><<< 19110 1726882582.43671: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
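The `{"ping": "pong", "invocation": {...}}` payload captured above is essentially the ping module's whole contract: it echoes its `data` argument back (default `"pong"`), which is why the role uses it to re-test connectivity after network changes. A minimal stand-in with the same observable behavior (an illustrative sketch, not Ansible's actual module source):

```python
import json

def ping_module(module_args: dict) -> dict:
    # Mirror the observable behavior of ansible.builtin.ping: echo the
    # "data" argument back, defaulting to "pong" when it is not supplied.
    data = module_args.get("data", "pong")
    return {"ping": data, "invocation": {"module_args": {"data": data}}}

# Matches the stdout chunk captured in the log above.
print(json.dumps(ping_module({"data": "pong"})))
```

As the surrounding log entries show, the module payload itself is only the final step of the lifecycle: a remote tmp directory is created, the AnsiballZ-wrapped module is transferred over sftp and chmod'ed, the remote Python runs it, and the tmp directory is removed afterwards.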
19110 1726882582.43675: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882582.1782417-20851-111329668827095/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 19110 1726882582.43678: _low_level_execute_command(): starting 19110 1726882582.43766: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882582.1782417-20851-111329668827095/ > /dev/null 2>&1 && sleep 0' 19110 1726882582.45126: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882582.45129: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882582.46013: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882582.46020: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882582.46022: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882582.46075: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882582.46573: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882582.46577: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882582.46673: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882582.48510: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882582.48584: stderr chunk (state=3): >>><<< 19110 1726882582.48587: stdout chunk (state=3): >>><<< 19110 1726882582.48775: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 
1726882582.48778: handler run complete 19110 1726882582.48781: attempt loop complete, returning result 19110 1726882582.48782: _execute() done 19110 1726882582.48784: dumping result to json 19110 1726882582.48786: done dumping result, returning 19110 1726882582.48788: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity [0e448fcc-3ce9-5372-c19a-000000000071] 19110 1726882582.48789: sending task result for task 0e448fcc-3ce9-5372-c19a-000000000071 19110 1726882582.48859: done sending task result for task 0e448fcc-3ce9-5372-c19a-000000000071 19110 1726882582.48862: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "ping": "pong" } 19110 1726882582.49034: no more pending results, returning what we have 19110 1726882582.49038: results queue empty 19110 1726882582.49039: checking for any_errors_fatal 19110 1726882582.49044: done checking for any_errors_fatal 19110 1726882582.49045: checking for max_fail_percentage 19110 1726882582.49047: done checking for max_fail_percentage 19110 1726882582.49048: checking to see if all hosts have failed and the running result is not ok 19110 1726882582.49049: done checking to see if all hosts have failed 19110 1726882582.49050: getting the remaining hosts for this loop 19110 1726882582.49052: done getting the remaining hosts for this loop 19110 1726882582.49058: getting the next task for host managed_node1 19110 1726882582.49069: done getting next task for host managed_node1 19110 1726882582.49072: ^ task is: TASK: meta (role_complete) 19110 1726882582.49074: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19110 1726882582.49086: getting variables 19110 1726882582.49087: in VariableManager get_vars() 19110 1726882582.49131: Calling all_inventory to load vars for managed_node1 19110 1726882582.49135: Calling groups_inventory to load vars for managed_node1 19110 1726882582.49137: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882582.49149: Calling all_plugins_play to load vars for managed_node1 19110 1726882582.49152: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882582.49159: Calling groups_plugins_play to load vars for managed_node1 19110 1726882582.53223: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882582.57271: done with get_vars() 19110 1726882582.57302: done getting variables 19110 1726882582.57508: done queuing things up, now waiting for results queue to drain 19110 1726882582.57511: results queue empty 19110 1726882582.57511: checking for any_errors_fatal 19110 1726882582.57515: done checking for any_errors_fatal 19110 1726882582.57515: checking for max_fail_percentage 19110 1726882582.57516: done checking for max_fail_percentage 19110 1726882582.57517: checking to see if all hosts have failed and the running result is not ok 19110 1726882582.57518: done checking to see if all hosts have failed 19110 1726882582.57519: getting the remaining hosts for this loop 19110 1726882582.57520: done getting the remaining hosts for this loop 19110 1726882582.57522: getting the next task for host managed_node1 19110 1726882582.57526: done getting next task for host managed_node1 19110 1726882582.57527: ^ task is: TASK: meta (flush_handlers) 19110 1726882582.57529: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 19110 1726882582.57531: getting variables 19110 1726882582.57532: in VariableManager get_vars() 19110 1726882582.57545: Calling all_inventory to load vars for managed_node1 19110 1726882582.57547: Calling groups_inventory to load vars for managed_node1 19110 1726882582.57549: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882582.57553: Calling all_plugins_play to load vars for managed_node1 19110 1726882582.57559: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882582.57675: Calling groups_plugins_play to load vars for managed_node1 19110 1726882582.60215: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882582.63092: done with get_vars() 19110 1726882582.63119: done getting variables 19110 1726882582.63179: in VariableManager get_vars() 19110 1726882582.63192: Calling all_inventory to load vars for managed_node1 19110 1726882582.63194: Calling groups_inventory to load vars for managed_node1 19110 1726882582.63196: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882582.63201: Calling all_plugins_play to load vars for managed_node1 19110 1726882582.63203: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882582.63205: Calling groups_plugins_play to load vars for managed_node1 19110 1726882582.64702: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882582.71949: done with get_vars() 19110 1726882582.71978: done queuing things up, now waiting for results queue to drain 19110 1726882582.71981: results queue empty 19110 1726882582.71981: checking for any_errors_fatal 19110 1726882582.71982: done checking for any_errors_fatal 19110 1726882582.71987: checking for max_fail_percentage 19110 1726882582.71988: done checking for max_fail_percentage 19110 1726882582.71988: checking to see if all hosts have failed and 
the running result is not ok 19110 1726882582.71989: done checking to see if all hosts have failed 19110 1726882582.71990: getting the remaining hosts for this loop 19110 1726882582.71991: done getting the remaining hosts for this loop 19110 1726882582.71993: getting the next task for host managed_node1 19110 1726882582.71996: done getting next task for host managed_node1 19110 1726882582.71997: ^ task is: TASK: meta (flush_handlers) 19110 1726882582.71998: ^ state is: HOST STATE: block=5, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19110 1726882582.72001: getting variables 19110 1726882582.72001: in VariableManager get_vars() 19110 1726882582.72011: Calling all_inventory to load vars for managed_node1 19110 1726882582.72014: Calling groups_inventory to load vars for managed_node1 19110 1726882582.72015: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882582.72020: Calling all_plugins_play to load vars for managed_node1 19110 1726882582.72022: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882582.72024: Calling groups_plugins_play to load vars for managed_node1 19110 1726882582.73524: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882582.75419: done with get_vars() 19110 1726882582.75445: done getting variables 19110 1726882582.75499: in VariableManager get_vars() 19110 1726882582.75512: Calling all_inventory to load vars for managed_node1 19110 1726882582.75514: Calling groups_inventory to load vars for managed_node1 19110 1726882582.75516: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882582.75521: Calling all_plugins_play to load vars for managed_node1 19110 1726882582.75523: Calling 
groups_plugins_inventory to load vars for managed_node1 19110 1726882582.75526: Calling groups_plugins_play to load vars for managed_node1 19110 1726882582.76853: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882582.80371: done with get_vars() 19110 1726882582.80400: done queuing things up, now waiting for results queue to drain 19110 1726882582.80402: results queue empty 19110 1726882582.80403: checking for any_errors_fatal 19110 1726882582.80404: done checking for any_errors_fatal 19110 1726882582.80405: checking for max_fail_percentage 19110 1726882582.80406: done checking for max_fail_percentage 19110 1726882582.80407: checking to see if all hosts have failed and the running result is not ok 19110 1726882582.80408: done checking to see if all hosts have failed 19110 1726882582.80409: getting the remaining hosts for this loop 19110 1726882582.80409: done getting the remaining hosts for this loop 19110 1726882582.80412: getting the next task for host managed_node1 19110 1726882582.80415: done getting next task for host managed_node1 19110 1726882582.80416: ^ task is: None 19110 1726882582.80417: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19110 1726882582.80419: done queuing things up, now waiting for results queue to drain 19110 1726882582.80419: results queue empty 19110 1726882582.80420: checking for any_errors_fatal 19110 1726882582.80421: done checking for any_errors_fatal 19110 1726882582.80421: checking for max_fail_percentage 19110 1726882582.80422: done checking for max_fail_percentage 19110 1726882582.80423: checking to see if all hosts have failed and the running result is not ok 19110 1726882582.80424: done checking to see if all hosts have failed 19110 1726882582.80424: getting the next task for host managed_node1 19110 1726882582.80427: done getting next task for host managed_node1 19110 1726882582.80427: ^ task is: None 19110 1726882582.80428: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19110 1726882582.80569: in VariableManager get_vars() 19110 1726882582.80585: done with get_vars() 19110 1726882582.80590: in VariableManager get_vars() 19110 1726882582.80599: done with get_vars() 19110 1726882582.80603: variable 'omit' from source: magic vars 19110 1726882582.80631: in VariableManager get_vars() 19110 1726882582.80639: done with get_vars() 19110 1726882582.80659: variable 'omit' from source: magic vars PLAY [Assert device and profile are absent] ************************************ 19110 1726882582.81243: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 19110 1726882582.81744: getting the remaining hosts for this loop 19110 1726882582.81745: done getting the remaining hosts for this loop 19110 1726882582.81748: getting the next task for host managed_node1 19110 1726882582.81751: done getting next task for host managed_node1 19110 1726882582.81753: ^ task is: TASK: Gathering Facts 19110 1726882582.81754: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19110 1726882582.81756: getting variables 19110 1726882582.81757: in VariableManager get_vars() 19110 1726882582.81767: Calling all_inventory to load vars for managed_node1 19110 1726882582.81769: Calling groups_inventory to load vars for managed_node1 19110 1726882582.81771: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882582.81776: Calling all_plugins_play to load vars for managed_node1 19110 1726882582.81778: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882582.81781: Calling groups_plugins_play to load vars for managed_node1 19110 1726882582.83890: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882582.87522: done with get_vars() 19110 1726882582.87550: done getting variables 19110 1726882582.87599: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:68 Friday 20 September 2024 21:36:22 -0400 (0:00:00.746) 0:00:39.734 ****** 19110 1726882582.87739: entering _queue_task() for managed_node1/gather_facts 19110 1726882582.88403: worker is 1 (out of 1 available) 19110 1726882582.88415: exiting _queue_task() for managed_node1/gather_facts 19110 1726882582.88427: done queuing things up, now waiting for results queue to drain 19110 1726882582.88428: waiting for pending results... 
19110 1726882582.90243: running TaskExecutor() for managed_node1/TASK: Gathering Facts 19110 1726882582.90382: in run() - task 0e448fcc-3ce9-5372-c19a-0000000004e4 19110 1726882582.90488: variable 'ansible_search_path' from source: unknown 19110 1726882582.90553: calling self._execute() 19110 1726882582.90784: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882582.90850: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882582.90870: variable 'omit' from source: magic vars 19110 1726882582.91586: variable 'ansible_distribution_major_version' from source: facts 19110 1726882582.91721: Evaluated conditional (ansible_distribution_major_version != '6'): True 19110 1726882582.91734: variable 'omit' from source: magic vars 19110 1726882582.91769: variable 'omit' from source: magic vars 19110 1726882582.91851: variable 'omit' from source: magic vars 19110 1726882582.91967: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19110 1726882582.92072: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19110 1726882582.92098: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19110 1726882582.92162: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19110 1726882582.92267: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19110 1726882582.92301: variable 'inventory_hostname' from source: host vars for 'managed_node1' 19110 1726882582.92310: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882582.92319: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882582.92533: Set connection var ansible_timeout to 10 19110 1726882582.92552: Set connection 
var ansible_module_compression to ZIP_DEFLATED 19110 1726882582.92571: Set connection var ansible_shell_executable to /bin/sh 19110 1726882582.92580: Set connection var ansible_shell_type to sh 19110 1726882582.92586: Set connection var ansible_connection to ssh 19110 1726882582.92690: Set connection var ansible_pipelining to False 19110 1726882582.92717: variable 'ansible_shell_executable' from source: unknown 19110 1726882582.92724: variable 'ansible_connection' from source: unknown 19110 1726882582.92732: variable 'ansible_module_compression' from source: unknown 19110 1726882582.92739: variable 'ansible_shell_type' from source: unknown 19110 1726882582.92746: variable 'ansible_shell_executable' from source: unknown 19110 1726882582.92752: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882582.92764: variable 'ansible_pipelining' from source: unknown 19110 1726882582.92773: variable 'ansible_timeout' from source: unknown 19110 1726882582.92782: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882582.93104: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 19110 1726882582.93236: variable 'omit' from source: magic vars 19110 1726882582.93247: starting attempt loop 19110 1726882582.93257: running the handler 19110 1726882582.93280: variable 'ansible_facts' from source: unknown 19110 1726882582.93300: _low_level_execute_command(): starting 19110 1726882582.93312: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 19110 1726882582.95258: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 
1726882582.95265: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882582.95302: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882582.95306: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882582.95309: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882582.95474: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882582.95486: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882582.95606: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882582.97262: stdout chunk (state=3): >>>/root <<< 19110 1726882582.97366: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882582.97444: stderr chunk (state=3): >>><<< 19110 1726882582.97448: stdout chunk (state=3): >>><<< 19110 1726882582.97573: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 1726882582.97577: _low_level_execute_command(): starting 19110 1726882582.97581: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882582.9747381-20885-162661657932989 `" && echo ansible-tmp-1726882582.9747381-20885-162661657932989="` echo /root/.ansible/tmp/ansible-tmp-1726882582.9747381-20885-162661657932989 `" ) && sleep 0' 19110 1726882582.98880: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882582.99017: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882582.99032: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882582.99049: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882582.99095: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882582.99119: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882582.99133: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882582.99151: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882582.99165: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882582.99177: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882582.99189: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882582.99203: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882582.99219: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882582.99239: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882582.99252: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882582.99268: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882582.99346: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882582.99475: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882582.99493: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882582.99619: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882583.01477: stdout chunk (state=3): >>>ansible-tmp-1726882582.9747381-20885-162661657932989=/root/.ansible/tmp/ansible-tmp-1726882582.9747381-20885-162661657932989 <<< 19110 1726882583.01723: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882583.01827: stderr chunk (state=3): >>><<< 19110 1726882583.01831: stdout chunk (state=3): >>><<< 19110 1726882583.01872: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726882582.9747381-20885-162661657932989=/root/.ansible/tmp/ansible-tmp-1726882582.9747381-20885-162661657932989 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 1726882583.02176: variable 'ansible_module_compression' from source: unknown 19110 1726882583.02179: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-191108pnkimox/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 19110 1726882583.02181: variable 'ansible_facts' from source: unknown 19110 1726882583.02320: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882582.9747381-20885-162661657932989/AnsiballZ_setup.py 19110 1726882583.02715: Sending initial data 19110 1726882583.02738: Sent initial data (154 bytes) 19110 1726882583.03745: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882583.03749: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882583.03769: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882583.03798: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882583.03801: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882583.03803: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882583.03805: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found <<< 19110 1726882583.03807: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882583.03873: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882583.03876: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882583.03880: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882583.03977: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882583.05705: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 
debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 19110 1726882583.06325: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 19110 1726882583.06329: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-191108pnkimox/tmp2xrcsrxp /root/.ansible/tmp/ansible-tmp-1726882582.9747381-20885-162661657932989/AnsiballZ_setup.py <<< 19110 1726882583.06585: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 19110 1726882583.10137: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882583.10378: stderr chunk (state=3): >>><<< 19110 1726882583.10381: stdout chunk (state=3): >>><<< 19110 1726882583.10383: done transferring module to remote 19110 1726882583.10385: _low_level_execute_command(): starting 19110 1726882583.10387: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882582.9747381-20885-162661657932989/ /root/.ansible/tmp/ansible-tmp-1726882582.9747381-20885-162661657932989/AnsiballZ_setup.py && sleep 0' 19110 1726882583.10950: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882583.10969: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882583.10986: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882583.11004: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882583.11044: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882583.11057: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882583.11075: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882583.11097: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882583.11108: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882583.11120: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882583.11133: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882583.11147: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882583.11171: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882583.11184: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882583.11197: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882583.11211: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882583.11282: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882583.11299: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882583.11312: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882583.11436: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882583.13236: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882583.13239: stdout chunk (state=3): >>><<< 19110 1726882583.13242: stderr chunk (state=3): >>><<< 19110 1726882583.13277: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 1726882583.13280: _low_level_execute_command(): starting 19110 1726882583.13347: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882582.9747381-20885-162661657932989/AnsiballZ_setup.py && sleep 0' 19110 1726882583.14362: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882583.14367: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882583.14370: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882583.14403: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found <<< 19110 1726882583.14407: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882583.14409: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882583.14476: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882583.14500: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882583.14622: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882583.65597: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_system": "Linux", "ansible_kernel": "5.14.0-508.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 15:49:37 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "ip-10-31-44-90.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-44-90", "ansible_nodename": "ip-10-31-44-90.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "465293f2bd9b457497a5eaf565f184f8", "ansible_ssh_host_key_dsa_public": 
"AAAAB3NzaC1kc3MAAACBANd7RrvRqB/kTBmp4g3vOXSd1XQX7zSKmnuTmpsZ60vmB0PwXJIj7HTI9DBqGrzofXOOHlh/Qs4ml+D2H79EO+6EUEZza9meZ+anJVVwXBuxgfn1Hl/EExsmp7gV5o+r0rCFeuFds6ISUZQpal2OlQk3lrit00RA//JoxvQO1YQdAAAAFQDxLJGSeERY5sZYyGr88om1BNq9KQAAAIEA1WO5ElHMof6Upf9GQZn4wlJh6JUOjZfYnL4XATh/W6uye7kuC/rBqGPirkmks1GCUkKhSkzQlRIPyLYENrbPKRMNarYAwwQ8N8BDOUWFDCcrO55SJdlLTyYGWdlVFysYGSMIyZT5ye4oL3Nkff/e1ZGqjvL0sLNJaye4za70Xj4AAACAXRnpJPDKgCzakgHO14HsH3r7qS4cvfQld+ThBJZcbtKtJUyUo1sqg9NbvK+hvA41TYWOVy52DcWn5yzPwfhAn+mQEcAdBXWqSSJdESZ9fPnbc9C1XO02sYpyX5+wsZnau23XhtnlnY8jRTpWgRt4X8AWaSND9mfeL8ZDTgVphRc=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCi3knQjBMOMykE1WC1CWkOsV5q0hmL0iOCSTq050rqbxjUmkXoe4BeOWZh4ID8k5GgD5emRh5PU5ME25rsP6hnelUBzDOCjdgI1rmTkUbw5oFRN+kpX2RbAM/2L5J6RrqqllFBjrHtSG6JINsPIWqLn9rlmE965Kj4AY3QplgiApeN07yDM5YPS2tpRpVHVIxZia7pdzKs+h+TXJoo/Z7SMAw8MTUNeIXd9bSzuxhulOrLNrvrcej4EVg88FYiy96oA+NBNLFb41RHNLumM3xUQvjCeyqP1hcUw9YAY+gwADhRGBKQ+JCAzzLqyM/3RAO8gXXJr1Yjr+H9xi8IwKB71QU/lw7bWc33YuNbe5mDlUHQ/a2qvo4O96wD8m4eZpu81iHiwtIU5cwKm+fk8sz9kxOR77AozaYLtjgW9FYUmxh2ChfTBV2rnFflhC3CjgRMlZv8CLtne5JcsRFSpHeCB2RXzA1JPiF89OxoUFa8NsPqdUyAMUkaR8MmW+fj+t8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAqf0PBdBRTqmYqGYFABEC2soCa/Gsi3A2munU<<< 19110 1726882583.65608: stdout chunk (state=3): >>>EZEo0enWjwRQivB5pKJDjOn6lwgGyr2ebU0/VpRzddk73uEfk=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOdij61fBkKDElS700Z560nYW2c4QCIx/VplUW7jn+UE", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, 
"ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_fibre_channel_wwn": [], "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_is_chroot": false, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "36", "second": "23", "epoch": "1726882583", "epoch_int": "1726882583", "date": "2024-09-20", "time": "21:36:23", "iso8601_micro": "2024-09-21T01:36:23.393120Z", "iso8601": "2024-09-21T01:36:23Z", "iso8601_basic": "20240920T213623393120", "iso8601_basic_short": "20240920T213623", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:ef4e1c39-6f50-438a-87e7-12fb70b80bde", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": 
"1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_loadavg": {"1m": 0.46, "5m": 0.41, "15m": 0.22}, "ansible_lsb": {}, "ansible_apparmor": {"status": "disabled"}, "ansible_fips": false, "ansible_local": {}, "ansible_iscsi_iqn": "", "ansible_env": {"SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.40.7 34614 10.31.44.90 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.40.7 34614 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "<<< 19110 1726882583.65676: stdout chunk (state=3): >>>ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2813, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 719, "free": 2813}, "nocache": {"free": 3275, "used": 257}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": 
"08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2ca037-5a7b-1ebc-28c4-7fb32adce23a", "ansible_product_uuid": "ec2ca037-5a7b-1ebc-28c4-7fb32adce23a", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 741, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264239202304, "block_size": 4096, "block_total": 65519355, "block_available": 64511524, "block_used": 1007831, "inode_total": 131071472, "inode_available": 130998699, "inode_used": 72773, "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013"}], "ansible_selinux_python_present": true, "ansible_selinux": 
{"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "02:9e:a1:0b:f9:6d", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.44.90", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::9e:a1ff:fe0b:f96d", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off 
[fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", 
"tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.44.90", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:9e:a1:0b:f9:6d", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.44.90"], "ansible_all_ipv6_addresses": ["fe80::9e:a1ff:fe0b:f96d"<<< 19110 1726882583.65684: stdout chunk (state=3): >>>], "ansible_locally_reachable_ips": {"ipv4": ["10.31.44.90", "127.0.0.0/8", "127.0.0.1"], "ipv6": 
["::1", "fe80::9e:a1ff:fe0b:f96d"]}, "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 19110 1726882583.67333: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. <<< 19110 1726882583.67336: stdout chunk (state=3): >>><<< 19110 1726882583.67361: stderr chunk (state=3): >>><<< 19110 1726882583.67494: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_system": "Linux", "ansible_kernel": "5.14.0-508.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 15:49:37 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "ip-10-31-44-90.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-44-90", "ansible_nodename": "ip-10-31-44-90.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "465293f2bd9b457497a5eaf565f184f8", "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBANd7RrvRqB/kTBmp4g3vOXSd1XQX7zSKmnuTmpsZ60vmB0PwXJIj7HTI9DBqGrzofXOOHlh/Qs4ml+D2H79EO+6EUEZza9meZ+anJVVwXBuxgfn1Hl/EExsmp7gV5o+r0rCFeuFds6ISUZQpal2OlQk3lrit00RA//JoxvQO1YQdAAAAFQDxLJGSeERY5sZYyGr88om1BNq9KQAAAIEA1WO5ElHMof6Upf9GQZn4wlJh6JUOjZfYnL4XATh/W6uye7kuC/rBqGPirkmks1GCUkKhSkzQlRIPyLYENrbPKRMNarYAwwQ8N8BDOUWFDCcrO55SJdlLTyYGWdlVFysYGSMIyZT5ye4oL3Nkff/e1ZGqjvL0sLNJaye4za70Xj4AAACAXRnpJPDKgCzakgHO14HsH3r7qS4cvfQld+ThBJZcbtKtJUyUo1sqg9NbvK+hvA41TYWOVy52DcWn5yzPwfhAn+mQEcAdBXWqSSJdESZ9fPnbc9C1XO02sYpyX5+wsZnau23XhtnlnY8jRTpWgRt4X8AWaSND9mfeL8ZDTgVphRc=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQCi3knQjBMOMykE1WC1CWkOsV5q0hmL0iOCSTq050rqbxjUmkXoe4BeOWZh4ID8k5GgD5emRh5PU5ME25rsP6hnelUBzDOCjdgI1rmTkUbw5oFRN+kpX2RbAM/2L5J6RrqqllFBjrHtSG6JINsPIWqLn9rlmE965Kj4AY3QplgiApeN07yDM5YPS2tpRpVHVIxZia7pdzKs+h+TXJoo/Z7SMAw8MTUNeIXd9bSzuxhulOrLNrvrcej4EVg88FYiy96oA+NBNLFb41RHNLumM3xUQvjCeyqP1hcUw9YAY+gwADhRGBKQ+JCAzzLqyM/3RAO8gXXJr1Yjr+H9xi8IwKB71QU/lw7bWc33YuNbe5mDlUHQ/a2qvo4O96wD8m4eZpu81iHiwtIU5cwKm+fk8sz9kxOR77AozaYLtjgW9FYUmxh2ChfTBV2rnFflhC3CjgRMlZv8CLtne5JcsRFSpHeCB2RXzA1JPiF89OxoUFa8NsPqdUyAMUkaR8MmW+fj+t8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAqf0PBdBRTqmYqGYFABEC2soCa/Gsi3A2munUEZEo0enWjwRQivB5pKJDjOn6lwgGyr2ebU0/VpRzddk73uEfk=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOdij61fBkKDElS700Z560nYW2c4QCIx/VplUW7jn+UE", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_fibre_channel_wwn": [], "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", 
"ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_is_chroot": false, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "36", "second": "23", "epoch": "1726882583", "epoch_int": "1726882583", "date": "2024-09-20", "time": "21:36:23", "iso8601_micro": "2024-09-21T01:36:23.393120Z", "iso8601": "2024-09-21T01:36:23Z", "iso8601_basic": "20240920T213623393120", "iso8601_basic_short": "20240920T213623", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:ef4e1c39-6f50-438a-87e7-12fb70b80bde", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_loadavg": {"1m": 0.46, "5m": 0.41, "15m": 0.22}, "ansible_lsb": {}, "ansible_apparmor": {"status": "disabled"}, "ansible_fips": false, "ansible_local": {}, "ansible_iscsi_iqn": "", "ansible_env": {"SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.40.7 34614 10.31.44.90 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh 
%s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.40.7 34614 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2813, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 719, "free": 2813}, "nocache": {"free": 3275, "used": 257}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2ca037-5a7b-1ebc-28c4-7fb32adce23a", "ansible_product_uuid": "ec2ca037-5a7b-1ebc-28c4-7fb32adce23a", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, 
"sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 741, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264239202304, "block_size": 4096, "block_total": 65519355, "block_available": 64511524, "block_used": 1007831, "inode_total": 131071472, "inode_available": 130998699, "inode_used": 72773, "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013"}], "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "02:9e:a1:0b:f9:6d", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.44.90", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::9e:a1ff:fe0b:f96d", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off 
[fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", 
"hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", 
"rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.44.90", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:9e:a1:0b:f9:6d", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.44.90"], "ansible_all_ipv6_addresses": ["fe80::9e:a1ff:fe0b:f96d"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.44.90", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::9e:a1ff:fe0b:f96d"]}, "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 19110 1726882583.68299: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882582.9747381-20885-162661657932989/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 19110 1726882583.68327: _low_level_execute_command(): starting 19110 1726882583.68337: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882582.9747381-20885-162661657932989/ > /dev/null 2>&1 && sleep 0' 19110 1726882583.69472: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882583.69491: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882583.69509: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882583.69528: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882583.69574: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882583.69591: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882583.69610: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882583.69629: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882583.69642: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882583.69674: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882583.69677: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882583.69679: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882583.69730: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882583.69734: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882583.69839: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882583.71666: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882583.71739: stderr chunk (state=3): >>><<< 19110 1726882583.71752: stdout chunk (state=3): >>><<< 19110 1726882583.71811: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 1726882583.71814: handler run complete 19110 1726882583.71968: variable 'ansible_facts' from source: unknown 19110 1726882583.72088: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882583.72495: variable 'ansible_facts' from source: unknown 19110 1726882583.72626: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882583.72795: attempt loop complete, returning result 19110 1726882583.72810: _execute() done 19110 1726882583.72818: dumping result to json 19110 1726882583.72871: done dumping result, returning 19110 1726882583.72889: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [0e448fcc-3ce9-5372-c19a-0000000004e4] 19110 1726882583.72907: sending task result for task 0e448fcc-3ce9-5372-c19a-0000000004e4 ok: [managed_node1] 19110 1726882583.73788: no more pending results, returning what we have 19110 1726882583.73794: 
results queue empty 19110 1726882583.73796: checking for any_errors_fatal 19110 1726882583.73797: done checking for any_errors_fatal 19110 1726882583.73798: checking for max_fail_percentage 19110 1726882583.73799: done checking for max_fail_percentage 19110 1726882583.73800: checking to see if all hosts have failed and the running result is not ok 19110 1726882583.73801: done checking to see if all hosts have failed 19110 1726882583.73802: getting the remaining hosts for this loop 19110 1726882583.73803: done getting the remaining hosts for this loop 19110 1726882583.73807: getting the next task for host managed_node1 19110 1726882583.73814: done getting next task for host managed_node1 19110 1726882583.73818: ^ task is: TASK: meta (flush_handlers) 19110 1726882583.73821: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19110 1726882583.73827: getting variables 19110 1726882583.73829: in VariableManager get_vars() 19110 1726882583.73867: Calling all_inventory to load vars for managed_node1 19110 1726882583.73870: Calling groups_inventory to load vars for managed_node1 19110 1726882583.73873: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882583.73886: Calling all_plugins_play to load vars for managed_node1 19110 1726882583.73892: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882583.73897: Calling groups_plugins_play to load vars for managed_node1 19110 1726882583.74479: done sending task result for task 0e448fcc-3ce9-5372-c19a-0000000004e4 19110 1726882583.74482: WORKER PROCESS EXITING 19110 1726882583.75526: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882583.77135: done with get_vars() 19110 1726882583.77153: done getting variables 19110 1726882583.77243: in VariableManager get_vars() 19110 1726882583.77252: Calling all_inventory to load vars for managed_node1 19110 1726882583.77254: Calling groups_inventory to load vars for managed_node1 19110 1726882583.77259: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882583.77265: Calling all_plugins_play to load vars for managed_node1 19110 1726882583.77270: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882583.77273: Calling groups_plugins_play to load vars for managed_node1 19110 1726882583.78713: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882583.80480: done with get_vars() 19110 1726882583.80518: done queuing things up, now waiting for results queue to drain 19110 1726882583.80520: results queue empty 19110 1726882583.80524: checking for any_errors_fatal 19110 1726882583.80527: done checking for any_errors_fatal 19110 1726882583.80530: checking for max_fail_percentage 19110 
1726882583.80531: done checking for max_fail_percentage 19110 1726882583.80533: checking to see if all hosts have failed and the running result is not ok 19110 1726882583.80534: done checking to see if all hosts have failed 19110 1726882583.80535: getting the remaining hosts for this loop 19110 1726882583.80535: done getting the remaining hosts for this loop 19110 1726882583.80540: getting the next task for host managed_node1 19110 1726882583.80546: done getting next task for host managed_node1 19110 1726882583.80549: ^ task is: TASK: Include the task 'assert_profile_absent.yml' 19110 1726882583.80551: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19110 1726882583.80553: getting variables 19110 1726882583.80554: in VariableManager get_vars() 19110 1726882583.80567: Calling all_inventory to load vars for managed_node1 19110 1726882583.80571: Calling groups_inventory to load vars for managed_node1 19110 1726882583.80574: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882583.80581: Calling all_plugins_play to load vars for managed_node1 19110 1726882583.80585: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882583.80591: Calling groups_plugins_play to load vars for managed_node1 19110 1726882583.82789: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882583.84213: done with get_vars() 19110 1726882583.84233: done getting variables TASK [Include the task 'assert_profile_absent.yml'] **************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:71 Friday 20 September 2024 21:36:23 -0400 (0:00:00.965) 0:00:40.700 ****** 
19110 1726882583.84330: entering _queue_task() for managed_node1/include_tasks 19110 1726882583.85203: worker is 1 (out of 1 available) 19110 1726882583.85218: exiting _queue_task() for managed_node1/include_tasks 19110 1726882583.85235: done queuing things up, now waiting for results queue to drain 19110 1726882583.85238: waiting for pending results... 19110 1726882583.85569: running TaskExecutor() for managed_node1/TASK: Include the task 'assert_profile_absent.yml' 19110 1726882583.85688: in run() - task 0e448fcc-3ce9-5372-c19a-000000000074 19110 1726882583.85712: variable 'ansible_search_path' from source: unknown 19110 1726882583.85766: calling self._execute() 19110 1726882583.85873: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882583.85887: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882583.85899: variable 'omit' from source: magic vars 19110 1726882583.86390: variable 'ansible_distribution_major_version' from source: facts 19110 1726882583.86416: Evaluated conditional (ansible_distribution_major_version != '6'): True 19110 1726882583.86435: _execute() done 19110 1726882583.86451: dumping result to json 19110 1726882583.86466: done dumping result, returning 19110 1726882583.86484: done running TaskExecutor() for managed_node1/TASK: Include the task 'assert_profile_absent.yml' [0e448fcc-3ce9-5372-c19a-000000000074] 19110 1726882583.86503: sending task result for task 0e448fcc-3ce9-5372-c19a-000000000074 19110 1726882583.86673: no more pending results, returning what we have 19110 1726882583.86680: in VariableManager get_vars() 19110 1726882583.86713: Calling all_inventory to load vars for managed_node1 19110 1726882583.86716: Calling groups_inventory to load vars for managed_node1 19110 1726882583.86720: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882583.86733: Calling all_plugins_play to load vars for managed_node1 19110 1726882583.86736: Calling 
groups_plugins_inventory to load vars for managed_node1 19110 1726882583.86739: Calling groups_plugins_play to load vars for managed_node1 19110 1726882583.87882: done sending task result for task 0e448fcc-3ce9-5372-c19a-000000000074 19110 1726882583.87885: WORKER PROCESS EXITING 19110 1726882583.88466: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882583.90271: done with get_vars() 19110 1726882583.90298: variable 'ansible_search_path' from source: unknown 19110 1726882583.90312: we have included files to process 19110 1726882583.90313: generating all_blocks data 19110 1726882583.90315: done generating all_blocks data 19110 1726882583.90316: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 19110 1726882583.90317: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 19110 1726882583.90319: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 19110 1726882583.90485: in VariableManager get_vars() 19110 1726882583.90509: done with get_vars() 19110 1726882583.90622: done processing included file 19110 1726882583.90625: iterating over new_blocks loaded from include file 19110 1726882583.90626: in VariableManager get_vars() 19110 1726882583.90638: done with get_vars() 19110 1726882583.90639: filtering new block on tags 19110 1726882583.90655: done filtering new block on tags 19110 1726882583.90657: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml for managed_node1 19110 1726882583.90662: extending task lists for all hosts with included blocks 19110 1726882583.90710: done extending task 
lists 19110 1726882583.90712: done processing included files 19110 1726882583.90712: results queue empty 19110 1726882583.90713: checking for any_errors_fatal 19110 1726882583.90715: done checking for any_errors_fatal 19110 1726882583.90715: checking for max_fail_percentage 19110 1726882583.90716: done checking for max_fail_percentage 19110 1726882583.90726: checking to see if all hosts have failed and the running result is not ok 19110 1726882583.90727: done checking to see if all hosts have failed 19110 1726882583.90728: getting the remaining hosts for this loop 19110 1726882583.90729: done getting the remaining hosts for this loop 19110 1726882583.90733: getting the next task for host managed_node1 19110 1726882583.90736: done getting next task for host managed_node1 19110 1726882583.90738: ^ task is: TASK: Include the task 'get_profile_stat.yml' 19110 1726882583.90741: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19110 1726882583.90743: getting variables 19110 1726882583.90744: in VariableManager get_vars() 19110 1726882583.90752: Calling all_inventory to load vars for managed_node1 19110 1726882583.90755: Calling groups_inventory to load vars for managed_node1 19110 1726882583.90757: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882583.90762: Calling all_plugins_play to load vars for managed_node1 19110 1726882583.90767: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882583.90770: Calling groups_plugins_play to load vars for managed_node1 19110 1726882583.92285: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882583.94096: done with get_vars() 19110 1726882583.94161: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml:3 Friday 20 September 2024 21:36:23 -0400 (0:00:00.099) 0:00:40.799 ****** 19110 1726882583.94245: entering _queue_task() for managed_node1/include_tasks 19110 1726882583.94587: worker is 1 (out of 1 available) 19110 1726882583.94600: exiting _queue_task() for managed_node1/include_tasks 19110 1726882583.94613: done queuing things up, now waiting for results queue to drain 19110 1726882583.94615: waiting for pending results... 
19110 1726882583.96115: running TaskExecutor() for managed_node1/TASK: Include the task 'get_profile_stat.yml' 19110 1726882583.96472: in run() - task 0e448fcc-3ce9-5372-c19a-0000000004f5 19110 1726882583.96504: variable 'ansible_search_path' from source: unknown 19110 1726882583.96512: variable 'ansible_search_path' from source: unknown 19110 1726882583.96628: calling self._execute() 19110 1726882583.96986: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882583.96998: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882583.97014: variable 'omit' from source: magic vars 19110 1726882583.97839: variable 'ansible_distribution_major_version' from source: facts 19110 1726882583.97858: Evaluated conditional (ansible_distribution_major_version != '6'): True 19110 1726882583.97871: _execute() done 19110 1726882583.97878: dumping result to json 19110 1726882583.97884: done dumping result, returning 19110 1726882583.97893: done running TaskExecutor() for managed_node1/TASK: Include the task 'get_profile_stat.yml' [0e448fcc-3ce9-5372-c19a-0000000004f5] 19110 1726882583.97903: sending task result for task 0e448fcc-3ce9-5372-c19a-0000000004f5 19110 1726882583.98037: done sending task result for task 0e448fcc-3ce9-5372-c19a-0000000004f5 19110 1726882583.98071: no more pending results, returning what we have 19110 1726882583.98077: in VariableManager get_vars() 19110 1726882583.98108: Calling all_inventory to load vars for managed_node1 19110 1726882583.98111: Calling groups_inventory to load vars for managed_node1 19110 1726882583.98114: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882583.98127: Calling all_plugins_play to load vars for managed_node1 19110 1726882583.98129: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882583.98132: Calling groups_plugins_play to load vars for managed_node1 19110 1726882583.98915: WORKER PROCESS EXITING 19110 
1726882583.99989: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882584.02146: done with get_vars() 19110 1726882584.02172: variable 'ansible_search_path' from source: unknown 19110 1726882584.02173: variable 'ansible_search_path' from source: unknown 19110 1726882584.02229: we have included files to process 19110 1726882584.02230: generating all_blocks data 19110 1726882584.02232: done generating all_blocks data 19110 1726882584.02233: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 19110 1726882584.02234: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 19110 1726882584.02237: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 19110 1726882584.03699: done processing included file 19110 1726882584.03701: iterating over new_blocks loaded from include file 19110 1726882584.03702: in VariableManager get_vars() 19110 1726882584.03727: done with get_vars() 19110 1726882584.03729: filtering new block on tags 19110 1726882584.03762: done filtering new block on tags 19110 1726882584.03767: in VariableManager get_vars() 19110 1726882584.03781: done with get_vars() 19110 1726882584.03783: filtering new block on tags 19110 1726882584.03805: done filtering new block on tags 19110 1726882584.03807: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node1 19110 1726882584.03812: extending task lists for all hosts with included blocks 19110 1726882584.03935: done extending task lists 19110 1726882584.03940: done processing included files 19110 1726882584.03942: results queue empty 19110 
1726882584.03942: checking for any_errors_fatal 19110 1726882584.03945: done checking for any_errors_fatal 19110 1726882584.03946: checking for max_fail_percentage 19110 1726882584.03947: done checking for max_fail_percentage 19110 1726882584.03948: checking to see if all hosts have failed and the running result is not ok 19110 1726882584.03949: done checking to see if all hosts have failed 19110 1726882584.03950: getting the remaining hosts for this loop 19110 1726882584.03951: done getting the remaining hosts for this loop 19110 1726882584.03954: getting the next task for host managed_node1 19110 1726882584.03961: done getting next task for host managed_node1 19110 1726882584.03964: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 19110 1726882584.03968: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19110 1726882584.03970: getting variables 19110 1726882584.03971: in VariableManager get_vars() 19110 1726882584.04023: Calling all_inventory to load vars for managed_node1 19110 1726882584.04026: Calling groups_inventory to load vars for managed_node1 19110 1726882584.04029: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882584.04034: Calling all_plugins_play to load vars for managed_node1 19110 1726882584.04036: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882584.04039: Calling groups_plugins_play to load vars for managed_node1 19110 1726882584.05478: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882584.07653: done with get_vars() 19110 1726882584.07680: done getting variables 19110 1726882584.07724: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Friday 20 September 2024 21:36:24 -0400 (0:00:00.135) 0:00:40.934 ****** 19110 1726882584.07753: entering _queue_task() for managed_node1/set_fact 19110 1726882584.08301: worker is 1 (out of 1 available) 19110 1726882584.08315: exiting _queue_task() for managed_node1/set_fact 19110 1726882584.08327: done queuing things up, now waiting for results queue to drain 19110 1726882584.08329: waiting for pending results... 
19110 1726882584.09062: running TaskExecutor() for managed_node1/TASK: Initialize NM profile exist and ansible_managed comment flag 19110 1726882584.09183: in run() - task 0e448fcc-3ce9-5372-c19a-000000000502 19110 1726882584.09203: variable 'ansible_search_path' from source: unknown 19110 1726882584.09209: variable 'ansible_search_path' from source: unknown 19110 1726882584.09252: calling self._execute() 19110 1726882584.09346: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882584.09362: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882584.09381: variable 'omit' from source: magic vars 19110 1726882584.09866: variable 'ansible_distribution_major_version' from source: facts 19110 1726882584.09893: Evaluated conditional (ansible_distribution_major_version != '6'): True 19110 1726882584.09905: variable 'omit' from source: magic vars 19110 1726882584.09972: variable 'omit' from source: magic vars 19110 1726882584.10021: variable 'omit' from source: magic vars 19110 1726882584.10072: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19110 1726882584.10122: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19110 1726882584.10149: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19110 1726882584.10177: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19110 1726882584.10194: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19110 1726882584.10237: variable 'inventory_hostname' from source: host vars for 'managed_node1' 19110 1726882584.10246: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882584.10254: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node1' 19110 1726882584.10388: Set connection var ansible_timeout to 10 19110 1726882584.10407: Set connection var ansible_module_compression to ZIP_DEFLATED 19110 1726882584.10417: Set connection var ansible_shell_executable to /bin/sh 19110 1726882584.10431: Set connection var ansible_shell_type to sh 19110 1726882584.10437: Set connection var ansible_connection to ssh 19110 1726882584.10447: Set connection var ansible_pipelining to False 19110 1726882584.10478: variable 'ansible_shell_executable' from source: unknown 19110 1726882584.10487: variable 'ansible_connection' from source: unknown 19110 1726882584.10495: variable 'ansible_module_compression' from source: unknown 19110 1726882584.10501: variable 'ansible_shell_type' from source: unknown 19110 1726882584.10508: variable 'ansible_shell_executable' from source: unknown 19110 1726882584.10515: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882584.10522: variable 'ansible_pipelining' from source: unknown 19110 1726882584.10531: variable 'ansible_timeout' from source: unknown 19110 1726882584.10544: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882584.10713: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 19110 1726882584.10729: variable 'omit' from source: magic vars 19110 1726882584.10739: starting attempt loop 19110 1726882584.10746: running the handler 19110 1726882584.10776: handler run complete 19110 1726882584.10792: attempt loop complete, returning result 19110 1726882584.10799: _execute() done 19110 1726882584.10805: dumping result to json 19110 1726882584.10813: done dumping result, returning 19110 1726882584.10824: done running TaskExecutor() for 
managed_node1/TASK: Initialize NM profile exist and ansible_managed comment flag [0e448fcc-3ce9-5372-c19a-000000000502] 19110 1726882584.10842: sending task result for task 0e448fcc-3ce9-5372-c19a-000000000502 19110 1726882584.10971: done sending task result for task 0e448fcc-3ce9-5372-c19a-000000000502 ok: [managed_node1] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 19110 1726882584.11029: no more pending results, returning what we have 19110 1726882584.11034: results queue empty 19110 1726882584.11035: checking for any_errors_fatal 19110 1726882584.11037: done checking for any_errors_fatal 19110 1726882584.11038: checking for max_fail_percentage 19110 1726882584.11040: done checking for max_fail_percentage 19110 1726882584.11041: checking to see if all hosts have failed and the running result is not ok 19110 1726882584.11042: done checking to see if all hosts have failed 19110 1726882584.11043: getting the remaining hosts for this loop 19110 1726882584.11045: done getting the remaining hosts for this loop 19110 1726882584.11049: getting the next task for host managed_node1 19110 1726882584.11058: done getting next task for host managed_node1 19110 1726882584.11061: ^ task is: TASK: Stat profile file 19110 1726882584.11068: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19110 1726882584.11075: getting variables 19110 1726882584.11078: in VariableManager get_vars() 19110 1726882584.11108: Calling all_inventory to load vars for managed_node1 19110 1726882584.11112: Calling groups_inventory to load vars for managed_node1 19110 1726882584.11115: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882584.11127: Calling all_plugins_play to load vars for managed_node1 19110 1726882584.11130: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882584.11134: Calling groups_plugins_play to load vars for managed_node1 19110 1726882584.12341: WORKER PROCESS EXITING 19110 1726882584.13281: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882584.15389: done with get_vars() 19110 1726882584.15416: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Friday 20 September 2024 21:36:24 -0400 (0:00:00.077) 0:00:41.012 ****** 19110 1726882584.15525: entering _queue_task() for managed_node1/stat 19110 1726882584.15832: worker is 1 (out of 1 available) 19110 1726882584.15846: exiting _queue_task() for managed_node1/stat 19110 1726882584.15861: done queuing things up, now waiting for results queue to drain 19110 1726882584.15862: waiting for pending results... 
19110 1726882584.16135: running TaskExecutor() for managed_node1/TASK: Stat profile file 19110 1726882584.16265: in run() - task 0e448fcc-3ce9-5372-c19a-000000000503 19110 1726882584.16291: variable 'ansible_search_path' from source: unknown 19110 1726882584.16301: variable 'ansible_search_path' from source: unknown 19110 1726882584.16340: calling self._execute() 19110 1726882584.16432: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882584.16442: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882584.16454: variable 'omit' from source: magic vars 19110 1726882584.16858: variable 'ansible_distribution_major_version' from source: facts 19110 1726882584.16879: Evaluated conditional (ansible_distribution_major_version != '6'): True 19110 1726882584.16890: variable 'omit' from source: magic vars 19110 1726882584.16943: variable 'omit' from source: magic vars 19110 1726882584.17045: variable 'profile' from source: include params 19110 1726882584.17056: variable 'interface' from source: set_fact 19110 1726882584.17129: variable 'interface' from source: set_fact 19110 1726882584.17162: variable 'omit' from source: magic vars 19110 1726882584.17216: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19110 1726882584.17266: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19110 1726882584.17297: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19110 1726882584.17320: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19110 1726882584.17336: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19110 1726882584.17383: variable 'inventory_hostname' from source: host vars for 'managed_node1' 19110 
1726882584.17395: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882584.17402: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882584.17538: Set connection var ansible_timeout to 10 19110 1726882584.17559: Set connection var ansible_module_compression to ZIP_DEFLATED 19110 1726882584.17572: Set connection var ansible_shell_executable to /bin/sh 19110 1726882584.17592: Set connection var ansible_shell_type to sh 19110 1726882584.17608: Set connection var ansible_connection to ssh 19110 1726882584.17622: Set connection var ansible_pipelining to False 19110 1726882584.17658: variable 'ansible_shell_executable' from source: unknown 19110 1726882584.17670: variable 'ansible_connection' from source: unknown 19110 1726882584.17677: variable 'ansible_module_compression' from source: unknown 19110 1726882584.17684: variable 'ansible_shell_type' from source: unknown 19110 1726882584.17690: variable 'ansible_shell_executable' from source: unknown 19110 1726882584.17702: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882584.17711: variable 'ansible_pipelining' from source: unknown 19110 1726882584.17719: variable 'ansible_timeout' from source: unknown 19110 1726882584.17729: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882584.17980: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 19110 1726882584.17997: variable 'omit' from source: magic vars 19110 1726882584.18007: starting attempt loop 19110 1726882584.18014: running the handler 19110 1726882584.18036: _low_level_execute_command(): starting 19110 1726882584.18053: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 19110 1726882584.19002: stderr chunk 
(state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882584.19019: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882584.19041: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882584.19075: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882584.19126: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882584.19152: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882584.19172: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882584.19190: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882584.19202: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882584.19212: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882584.19225: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882584.19238: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882584.19263: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882584.19281: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882584.19294: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882584.19310: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882584.19414: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882584.19437: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882584.19452: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 19110 1726882584.19600: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882584.21262: stdout chunk (state=3): >>>/root <<< 19110 1726882584.21442: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882584.21445: stdout chunk (state=3): >>><<< 19110 1726882584.21448: stderr chunk (state=3): >>><<< 19110 1726882584.21550: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 1726882584.21553: _low_level_execute_command(): starting 19110 1726882584.21559: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882584.2147071-20933-35637276166183 `" && echo ansible-tmp-1726882584.2147071-20933-35637276166183="` echo 
/root/.ansible/tmp/ansible-tmp-1726882584.2147071-20933-35637276166183 `" ) && sleep 0' 19110 1726882584.22100: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882584.22113: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882584.22127: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882584.22143: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882584.22188: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882584.22200: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882584.22212: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882584.22228: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882584.22240: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882584.22250: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882584.22268: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882584.22282: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882584.22296: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882584.22307: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882584.22317: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882584.22328: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882584.22405: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882584.22425: 
stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882584.22439: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882584.22568: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882584.24421: stdout chunk (state=3): >>>ansible-tmp-1726882584.2147071-20933-35637276166183=/root/.ansible/tmp/ansible-tmp-1726882584.2147071-20933-35637276166183 <<< 19110 1726882584.24584: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882584.24587: stderr chunk (state=3): >>><<< 19110 1726882584.24592: stdout chunk (state=3): >>><<< 19110 1726882584.24612: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882584.2147071-20933-35637276166183=/root/.ansible/tmp/ansible-tmp-1726882584.2147071-20933-35637276166183 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from 
master 0 19110 1726882584.24661: variable 'ansible_module_compression' from source: unknown 19110 1726882584.24738: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-191108pnkimox/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 19110 1726882584.24781: variable 'ansible_facts' from source: unknown 19110 1726882584.24851: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882584.2147071-20933-35637276166183/AnsiballZ_stat.py 19110 1726882584.25082: Sending initial data 19110 1726882584.25094: Sent initial data (152 bytes) 19110 1726882584.26002: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882584.26008: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882584.26050: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882584.26058: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882584.26079: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882584.26085: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882584.26162: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882584.26191: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 19110 1726882584.26309: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882584.28021: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 19110 1726882584.28109: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 19110 1726882584.28205: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-191108pnkimox/tmpoir71_zb /root/.ansible/tmp/ansible-tmp-1726882584.2147071-20933-35637276166183/AnsiballZ_stat.py <<< 19110 1726882584.28297: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 19110 1726882584.29595: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882584.29716: stderr chunk (state=3): >>><<< 19110 1726882584.29719: stdout chunk (state=3): >>><<< 19110 1726882584.29739: done transferring module to remote 19110 1726882584.29749: _low_level_execute_command(): starting 19110 1726882584.29754: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882584.2147071-20933-35637276166183/ /root/.ansible/tmp/ansible-tmp-1726882584.2147071-20933-35637276166183/AnsiballZ_stat.py && sleep 0' 19110 1726882584.30350: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 
1726882584.30361: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882584.30374: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882584.30384: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882584.30421: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882584.30428: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882584.30437: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882584.30450: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882584.30460: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882584.30463: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882584.30477: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882584.30487: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882584.30495: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882584.30503: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882584.30509: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882584.30518: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882584.30601: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882584.30610: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882584.30617: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882584.30736: 
stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882584.32494: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882584.32515: stderr chunk (state=3): >>><<< 19110 1726882584.32519: stdout chunk (state=3): >>><<< 19110 1726882584.32611: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 1726882584.32615: _low_level_execute_command(): starting 19110 1726882584.32618: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882584.2147071-20933-35637276166183/AnsiballZ_stat.py && sleep 0' 19110 1726882584.33173: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882584.33188: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882584.33204: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882584.33222: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882584.33270: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882584.33283: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882584.33298: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882584.33316: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882584.33329: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882584.33341: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882584.33354: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882584.33375: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882584.33391: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882584.33404: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882584.33416: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882584.33429: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882584.33510: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882584.33527: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882584.33542: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882584.33688: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882584.46580: stdout chunk 
(state=3): >>> <<< 19110 1726882584.46587: stdout chunk (state=3): >>>{"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-lsr27", "follow": false, "checksum_algorithm": "sha1"}}} <<< 19110 1726882584.47530: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. <<< 19110 1726882584.47595: stderr chunk (state=3): >>><<< 19110 1726882584.47599: stdout chunk (state=3): >>><<< 19110 1726882584.47616: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-lsr27", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: 
Received exit status from master 0 Shared connection to 10.31.44.90 closed. 19110 1726882584.47644: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-lsr27', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882584.2147071-20933-35637276166183/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 19110 1726882584.47654: _low_level_execute_command(): starting 19110 1726882584.47659: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882584.2147071-20933-35637276166183/ > /dev/null 2>&1 && sleep 0' 19110 1726882584.48245: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882584.48254: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882584.48267: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882584.48282: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882584.48319: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882584.48327: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882584.48335: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882584.48348: stderr chunk 
(state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882584.48358: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882584.48361: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882584.48373: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882584.48382: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882584.48393: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882584.48400: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882584.48407: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882584.48416: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882584.48488: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882584.48498: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882584.48512: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882584.48650: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882584.50501: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882584.50504: stdout chunk (state=3): >>><<< 19110 1726882584.50506: stderr chunk (state=3): >>><<< 19110 1726882584.50569: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 1726882584.50574: handler run complete 19110 1726882584.50577: attempt loop complete, returning result 19110 1726882584.50579: _execute() done 19110 1726882584.50581: dumping result to json 19110 1726882584.50583: done dumping result, returning 19110 1726882584.50771: done running TaskExecutor() for managed_node1/TASK: Stat profile file [0e448fcc-3ce9-5372-c19a-000000000503] 19110 1726882584.50774: sending task result for task 0e448fcc-3ce9-5372-c19a-000000000503 19110 1726882584.50847: done sending task result for task 0e448fcc-3ce9-5372-c19a-000000000503 19110 1726882584.50850: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "stat": { "exists": false } } 19110 1726882584.50936: no more pending results, returning what we have 19110 1726882584.50940: results queue empty 19110 1726882584.50942: checking for any_errors_fatal 19110 1726882584.50948: done checking for any_errors_fatal 19110 1726882584.50949: checking for max_fail_percentage 19110 1726882584.50951: done checking for max_fail_percentage 19110 1726882584.50952: checking to see if all hosts have failed and the running result is not ok 19110 1726882584.50953: done checking to see if all hosts have failed 
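The `ok:` result just above is the JSON the stat module printed on its stdout. A minimal Python sketch of consuming such a result (field names taken from the log; the `raw` string is abridged and the helper name is hypothetical, not Ansible's API):

```python
import json

# Module stdout as captured in the log above (abridged to the fields used here).
raw = ('{"changed": false, "stat": {"exists": false}, '
       '"invocation": {"module_args": '
       '{"path": "/etc/sysconfig/network-scripts/ifcfg-lsr27"}}}')

def profile_file_exists(module_stdout: str) -> bool:
    """Hypothetical helper: report whether the stat'ed path exists."""
    result = json.loads(module_stdout)
    return bool(result.get("stat", {}).get("exists", False))

print(profile_file_exists(raw))  # prints: False (the ifcfg file is absent)
```

The `False` here is exactly what drives the next task's skip further down in the log.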
19110 1726882584.50954: getting the remaining hosts for this loop 19110 1726882584.50958: done getting the remaining hosts for this loop 19110 1726882584.50963: getting the next task for host managed_node1 19110 1726882584.50972: done getting next task for host managed_node1 19110 1726882584.50975: ^ task is: TASK: Set NM profile exist flag based on the profile files 19110 1726882584.50980: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19110 1726882584.50985: getting variables 19110 1726882584.50986: in VariableManager get_vars() 19110 1726882584.51018: Calling all_inventory to load vars for managed_node1 19110 1726882584.51021: Calling groups_inventory to load vars for managed_node1 19110 1726882584.51025: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882584.51037: Calling all_plugins_play to load vars for managed_node1 19110 1726882584.51039: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882584.51043: Calling groups_plugins_play to load vars for managed_node1 19110 1726882584.52687: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882584.55543: done with get_vars() 19110 1726882584.55567: done getting variables 19110 1726882584.55672: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Friday 20 September 2024 21:36:24 -0400 (0:00:00.401) 0:00:41.414 ****** 19110 1726882584.55703: entering _queue_task() for managed_node1/set_fact 19110 1726882584.56033: worker is 1 (out of 1 available) 19110 1726882584.56044: exiting _queue_task() for managed_node1/set_fact 19110 1726882584.56055: done queuing things up, now waiting for results queue to drain 19110 1726882584.56057: waiting for pending results... 
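Earlier in this run, `_low_level_execute_command()` created the remote temp directory with `umask 77 && mkdir -p ... && mkdir ansible-tmp-...`. A local Python sketch of that same pattern (the path layout and names here are illustrative, not Ansible's actual remote paths):

```python
import os
import tempfile
import time

# Reproduce the `umask 77 && mkdir -p root && mkdir root/ansible-tmp-...`
# pattern from the log; directory names are illustrative only.
old_umask = os.umask(0o077)            # like `umask 77`: new dirs become 0700
try:
    root = os.path.join(tempfile.gettempdir(), "ansible-tmp-demo")
    os.makedirs(root, exist_ok=True)   # `mkdir -p`: no error if it exists
    name = f"ansible-tmp-{time.time()}-{os.getpid()}"
    path = os.path.join(root, name)
    os.mkdir(path)                     # plain `mkdir`: fails if it exists
    print(path)
finally:
    os.umask(old_umask)                # restore the process umask
```

The restrictive umask is the point of the pattern: the per-task directory is created unreadable by group and other before any module payload lands in it.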
19110 1726882584.56346: running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag based on the profile files 19110 1726882584.56466: in run() - task 0e448fcc-3ce9-5372-c19a-000000000504 19110 1726882584.56487: variable 'ansible_search_path' from source: unknown 19110 1726882584.56499: variable 'ansible_search_path' from source: unknown 19110 1726882584.56544: calling self._execute() 19110 1726882584.56640: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882584.56651: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882584.56668: variable 'omit' from source: magic vars 19110 1726882584.57075: variable 'ansible_distribution_major_version' from source: facts 19110 1726882584.57093: Evaluated conditional (ansible_distribution_major_version != '6'): True 19110 1726882584.57224: variable 'profile_stat' from source: set_fact 19110 1726882584.57241: Evaluated conditional (profile_stat.stat.exists): False 19110 1726882584.57248: when evaluation is False, skipping this task 19110 1726882584.57262: _execute() done 19110 1726882584.57276: dumping result to json 19110 1726882584.57283: done dumping result, returning 19110 1726882584.57292: done running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag based on the profile files [0e448fcc-3ce9-5372-c19a-000000000504] 19110 1726882584.57302: sending task result for task 0e448fcc-3ce9-5372-c19a-000000000504 skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 19110 1726882584.57740: no more pending results, returning what we have 19110 1726882584.57745: results queue empty 19110 1726882584.57747: checking for any_errors_fatal 19110 1726882584.57756: done checking for any_errors_fatal 19110 1726882584.57757: checking for max_fail_percentage 19110 1726882584.57758: done checking for max_fail_percentage 19110 1726882584.57759: checking to see if all 
hosts have failed and the running result is not ok 19110 1726882584.57760: done checking to see if all hosts have failed 19110 1726882584.57760: getting the remaining hosts for this loop 19110 1726882584.57762: done getting the remaining hosts for this loop 19110 1726882584.57768: getting the next task for host managed_node1 19110 1726882584.57776: done getting next task for host managed_node1 19110 1726882584.57779: ^ task is: TASK: Get NM profile info 19110 1726882584.57783: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19110 1726882584.57787: getting variables 19110 1726882584.57789: in VariableManager get_vars() 19110 1726882584.57818: Calling all_inventory to load vars for managed_node1 19110 1726882584.57821: Calling groups_inventory to load vars for managed_node1 19110 1726882584.57825: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882584.57837: Calling all_plugins_play to load vars for managed_node1 19110 1726882584.57839: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882584.57842: Calling groups_plugins_play to load vars for managed_node1 19110 1726882584.58936: done sending task result for task 0e448fcc-3ce9-5372-c19a-000000000504 19110 1726882584.58940: WORKER PROCESS EXITING 19110 1726882584.59862: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882584.61704: done with get_vars() 19110 1726882584.61731: done getting variables 19110 1726882584.61859: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Friday 20 September 2024 21:36:24 -0400 (0:00:00.061) 0:00:41.476 ****** 19110 1726882584.61897: entering _queue_task() for managed_node1/shell 19110 1726882584.61899: Creating lock for shell 19110 1726882584.62266: worker is 1 (out of 1 available) 19110 1726882584.62279: exiting _queue_task() for managed_node1/shell 19110 1726882584.62291: done queuing things up, now waiting for results queue to drain 19110 1726882584.62293: waiting for pending results... 
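The skip recorded above follows from the task's two `when:` clauses: `ansible_distribution_major_version != '6'` evaluated True, but `profile_stat.stat.exists` evaluated False. A simplified Python sketch of that gate (Ansible actually evaluates these as Jinja2 expressions; `eval` with dict indexing is a stand-in, the helper name is hypothetical, and the distribution version value is assumed):

```python
# Host vars as the log reports them for managed_node1; the version value
# is assumed for illustration, the stat result comes from the log.
host_vars = {
    "ansible_distribution_major_version": "9",
    "profile_stat": {"stat": {"exists": False}},
}

def task_should_run(conditionals, variables) -> bool:
    """Hypothetical stand-in for Ansible's conditional evaluation:
    every `when:` entry must be truthy for the task to run."""
    return all(bool(eval(cond, {}, dict(variables))) for cond in conditionals)

when = [
    "ansible_distribution_major_version != '6'",   # True in the log
    "profile_stat['stat']['exists']",              # False -> task is skipped
]
print(task_should_run(when, host_vars))  # prints: False, i.e. skip the task
```

This matches the log's behavior: the first conditional passes, the second short-circuits the task, and the worker reports `skip_reason: Conditional result was False` without ever contacting the host.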
19110 1726882584.63636: running TaskExecutor() for managed_node1/TASK: Get NM profile info 19110 1726882584.63747: in run() - task 0e448fcc-3ce9-5372-c19a-000000000505 19110 1726882584.63766: variable 'ansible_search_path' from source: unknown 19110 1726882584.63770: variable 'ansible_search_path' from source: unknown 19110 1726882584.63801: calling self._execute() 19110 1726882584.63887: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882584.63902: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882584.63906: variable 'omit' from source: magic vars 19110 1726882584.64267: variable 'ansible_distribution_major_version' from source: facts 19110 1726882584.64284: Evaluated conditional (ansible_distribution_major_version != '6'): True 19110 1726882584.64287: variable 'omit' from source: magic vars 19110 1726882584.64330: variable 'omit' from source: magic vars 19110 1726882584.64434: variable 'profile' from source: include params 19110 1726882584.64438: variable 'interface' from source: set_fact 19110 1726882584.64516: variable 'interface' from source: set_fact 19110 1726882584.64530: variable 'omit' from source: magic vars 19110 1726882584.64568: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19110 1726882584.64593: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19110 1726882584.64613: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19110 1726882584.64627: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19110 1726882584.64636: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19110 1726882584.64672: variable 'inventory_hostname' from source: host vars for 'managed_node1' 19110 
1726882584.64675: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882584.64677: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882584.64747: Set connection var ansible_timeout to 10 19110 1726882584.64762: Set connection var ansible_module_compression to ZIP_DEFLATED 19110 1726882584.64774: Set connection var ansible_shell_executable to /bin/sh 19110 1726882584.64777: Set connection var ansible_shell_type to sh 19110 1726882584.64779: Set connection var ansible_connection to ssh 19110 1726882584.64784: Set connection var ansible_pipelining to False 19110 1726882584.64840: variable 'ansible_shell_executable' from source: unknown 19110 1726882584.64843: variable 'ansible_connection' from source: unknown 19110 1726882584.64845: variable 'ansible_module_compression' from source: unknown 19110 1726882584.64847: variable 'ansible_shell_type' from source: unknown 19110 1726882584.64850: variable 'ansible_shell_executable' from source: unknown 19110 1726882584.64852: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882584.64854: variable 'ansible_pipelining' from source: unknown 19110 1726882584.64856: variable 'ansible_timeout' from source: unknown 19110 1726882584.64858: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882584.64977: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 19110 1726882584.64994: variable 'omit' from source: magic vars 19110 1726882584.65004: starting attempt loop 19110 1726882584.65011: running the handler 19110 1726882584.65025: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 19110 1726882584.65048: _low_level_execute_command(): starting 19110 1726882584.65072: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 19110 1726882584.65852: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882584.66018: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882584.66089: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882584.67690: stdout chunk (state=3): >>>/root <<< 19110 1726882584.67793: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882584.67844: stderr chunk (state=3): >>><<< 19110 1726882584.67852: stdout chunk (state=3): >>><<< 19110 1726882584.67940: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 1726882584.67944: _low_level_execute_command(): starting 19110 1726882584.67948: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882584.6787686-20955-227006553957669 `" && echo ansible-tmp-1726882584.6787686-20955-227006553957669="` echo /root/.ansible/tmp/ansible-tmp-1726882584.6787686-20955-227006553957669 `" ) && sleep 0' 19110 1726882584.68407: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882584.68411: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882584.68429: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 
1726882584.68435: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882584.68444: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882584.68454: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882584.68459: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882584.68476: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882584.68487: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882584.68489: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882584.68497: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882584.68543: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882584.68559: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882584.68567: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882584.68678: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882584.70530: stdout chunk (state=3): >>>ansible-tmp-1726882584.6787686-20955-227006553957669=/root/.ansible/tmp/ansible-tmp-1726882584.6787686-20955-227006553957669 <<< 19110 1726882584.70705: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882584.70709: stderr chunk (state=3): >>><<< 19110 1726882584.70711: stdout chunk (state=3): >>><<< 19110 1726882584.70784: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726882584.6787686-20955-227006553957669=/root/.ansible/tmp/ansible-tmp-1726882584.6787686-20955-227006553957669 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 1726882584.70792: variable 'ansible_module_compression' from source: unknown 19110 1726882584.70832: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-191108pnkimox/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 19110 1726882584.70872: variable 'ansible_facts' from source: unknown 19110 1726882584.70947: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882584.6787686-20955-227006553957669/AnsiballZ_command.py 19110 1726882584.71113: Sending initial data 19110 1726882584.71116: Sent initial data (156 bytes) 19110 1726882584.72006: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882584.72015: stderr chunk (state=3): >>>debug1: Reading 
configuration data /root/.ssh/config <<< 19110 1726882584.72027: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882584.72044: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882584.72081: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882584.72089: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882584.72099: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882584.72112: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882584.72119: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882584.72127: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882584.72133: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882584.72143: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882584.72153: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882584.72160: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882584.72170: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882584.72177: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882584.72248: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882584.72269: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882584.72277: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882584.72399: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 19110 1726882584.74095: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 <<< 19110 1726882584.74100: stderr chunk (state=3): >>>debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 19110 1726882584.74204: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 19110 1726882584.74291: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-191108pnkimox/tmpq4pit4nw /root/.ansible/tmp/ansible-tmp-1726882584.6787686-20955-227006553957669/AnsiballZ_command.py <<< 19110 1726882584.74380: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 19110 1726882584.75969: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882584.75973: stderr chunk (state=3): >>><<< 19110 1726882584.75976: stdout chunk (state=3): >>><<< 19110 1726882584.75978: done transferring module to remote 19110 1726882584.75981: _low_level_execute_command(): starting 19110 1726882584.75983: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882584.6787686-20955-227006553957669/ /root/.ansible/tmp/ansible-tmp-1726882584.6787686-20955-227006553957669/AnsiballZ_command.py && sleep 0' 19110 1726882584.76748: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882584.76782: stderr chunk (state=3): >>>debug1: 
Reading configuration data /root/.ssh/config <<< 19110 1726882584.76811: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882584.76846: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882584.76897: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882584.76937: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882584.76951: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882584.76955: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882584.76964: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882584.76990: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882584.77015: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882584.77041: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882584.77061: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882584.77086: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882584.77099: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882584.77104: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882584.77226: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882584.77245: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882584.77265: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882584.77389: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 19110 1726882584.79121: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882584.79169: stderr chunk (state=3): >>><<< 19110 1726882584.79174: stdout chunk (state=3): >>><<< 19110 1726882584.79176: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 1726882584.79180: _low_level_execute_command(): starting 19110 1726882584.79189: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882584.6787686-20955-227006553957669/AnsiballZ_command.py && sleep 0' 19110 1726882584.79793: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882584.79796: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882584.79831: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882584.79835: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882584.79837: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882584.79885: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882584.79889: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882584.79996: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882584.94601: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 1, "cmd": "nmcli -f NAME,FILENAME connection show |grep lsr27 | grep /etc", "start": "2024-09-20 21:36:24.927350", "end": "2024-09-20 21:36:24.944481", "delta": "0:00:00.017131", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep lsr27 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 19110 1726882584.95779: stderr chunk (state=3): >>>debug2: Received exit status 
from master 1 Shared connection to 10.31.44.90 closed. <<< 19110 1726882584.95783: stdout chunk (state=3): >>><<< 19110 1726882584.95785: stderr chunk (state=3): >>><<< 19110 1726882584.95926: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 1, "cmd": "nmcli -f NAME,FILENAME connection show |grep lsr27 | grep /etc", "start": "2024-09-20 21:36:24.927350", "end": "2024-09-20 21:36:24.944481", "delta": "0:00:00.017131", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep lsr27 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 
10.31.44.90 closed. 19110 1726882584.95930: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep lsr27 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882584.6787686-20955-227006553957669/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 19110 1726882584.95940: _low_level_execute_command(): starting 19110 1726882584.95943: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882584.6787686-20955-227006553957669/ > /dev/null 2>&1 && sleep 0' 19110 1726882584.96491: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882584.96504: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882584.96515: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882584.96530: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882584.96569: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882584.96581: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882584.96593: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882584.96608: stderr chunk (state=3): >>>debug1: configuration 
requests final Match pass <<< 19110 1726882584.96617: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882584.96625: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882584.96634: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882584.96644: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882584.96656: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882584.96670: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882584.96681: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882584.96693: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882584.96760: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882584.96779: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882584.96794: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882584.96983: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882584.98777: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882584.98849: stderr chunk (state=3): >>><<< 19110 1726882584.98861: stdout chunk (state=3): >>><<< 19110 1726882584.99170: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 1726882584.99174: handler run complete 19110 1726882584.99176: Evaluated conditional (False): False 19110 1726882584.99179: attempt loop complete, returning result 19110 1726882584.99181: _execute() done 19110 1726882584.99183: dumping result to json 19110 1726882584.99184: done dumping result, returning 19110 1726882584.99186: done running TaskExecutor() for managed_node1/TASK: Get NM profile info [0e448fcc-3ce9-5372-c19a-000000000505] 19110 1726882584.99188: sending task result for task 0e448fcc-3ce9-5372-c19a-000000000505 19110 1726882584.99271: done sending task result for task 0e448fcc-3ce9-5372-c19a-000000000505 19110 1726882584.99276: WORKER PROCESS EXITING fatal: [managed_node1]: FAILED! 
=> { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep lsr27 | grep /etc", "delta": "0:00:00.017131", "end": "2024-09-20 21:36:24.944481", "rc": 1, "start": "2024-09-20 21:36:24.927350" } MSG: non-zero return code ...ignoring 19110 1726882584.99357: no more pending results, returning what we have 19110 1726882584.99361: results queue empty 19110 1726882584.99362: checking for any_errors_fatal 19110 1726882584.99368: done checking for any_errors_fatal 19110 1726882584.99369: checking for max_fail_percentage 19110 1726882584.99371: done checking for max_fail_percentage 19110 1726882584.99372: checking to see if all hosts have failed and the running result is not ok 19110 1726882584.99373: done checking to see if all hosts have failed 19110 1726882584.99375: getting the remaining hosts for this loop 19110 1726882584.99377: done getting the remaining hosts for this loop 19110 1726882584.99381: getting the next task for host managed_node1 19110 1726882584.99388: done getting next task for host managed_node1 19110 1726882584.99391: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 19110 1726882584.99395: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False 19110 1726882584.99400: getting variables 19110 1726882584.99401: in VariableManager get_vars() 19110 1726882584.99432: Calling all_inventory to load vars for managed_node1 19110 1726882584.99435: Calling groups_inventory to load vars for managed_node1 19110 1726882584.99439: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882584.99450: Calling all_plugins_play to load vars for managed_node1 19110 1726882584.99453: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882584.99455: Calling groups_plugins_play to load vars for managed_node1 19110 1726882585.01337: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882585.03226: done with get_vars() 19110 1726882585.03250: done getting variables 19110 1726882585.03316: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Friday 20 September 2024 21:36:25 -0400 (0:00:00.414) 0:00:41.890 ****** 19110 1726882585.03353: entering _queue_task() for managed_node1/set_fact 19110 1726882585.04770: worker is 1 (out of 1 available) 19110 1726882585.04784: exiting _queue_task() for managed_node1/set_fact 19110 1726882585.04796: done queuing things up, now waiting for results queue to drain 19110 1726882585.04798: waiting for pending results... 
19110 1726882585.05712: running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 19110 1726882585.05811: in run() - task 0e448fcc-3ce9-5372-c19a-000000000506 19110 1726882585.05825: variable 'ansible_search_path' from source: unknown 19110 1726882585.05828: variable 'ansible_search_path' from source: unknown 19110 1726882585.05865: calling self._execute() 19110 1726882585.05940: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882585.05945: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882585.05954: variable 'omit' from source: magic vars 19110 1726882585.06308: variable 'ansible_distribution_major_version' from source: facts 19110 1726882585.06322: Evaluated conditional (ansible_distribution_major_version != '6'): True 19110 1726882585.06447: variable 'nm_profile_exists' from source: set_fact 19110 1726882585.06460: Evaluated conditional (nm_profile_exists.rc == 0): False 19110 1726882585.06465: when evaluation is False, skipping this task 19110 1726882585.06468: _execute() done 19110 1726882585.06470: dumping result to json 19110 1726882585.06473: done dumping result, returning 19110 1726882585.06480: done running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [0e448fcc-3ce9-5372-c19a-000000000506] 19110 1726882585.06487: sending task result for task 0e448fcc-3ce9-5372-c19a-000000000506 19110 1726882585.06576: done sending task result for task 0e448fcc-3ce9-5372-c19a-000000000506 19110 1726882585.06579: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "nm_profile_exists.rc == 0", "skip_reason": "Conditional result was False" } 19110 1726882585.06647: no more pending results, returning what we have 19110 1726882585.06651: results queue empty 19110 1726882585.06652: checking for any_errors_fatal 19110 
1726882585.06666: done checking for any_errors_fatal 19110 1726882585.06666: checking for max_fail_percentage 19110 1726882585.06668: done checking for max_fail_percentage 19110 1726882585.06669: checking to see if all hosts have failed and the running result is not ok 19110 1726882585.06670: done checking to see if all hosts have failed 19110 1726882585.06670: getting the remaining hosts for this loop 19110 1726882585.06672: done getting the remaining hosts for this loop 19110 1726882585.06675: getting the next task for host managed_node1 19110 1726882585.06684: done getting next task for host managed_node1 19110 1726882585.06687: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 19110 1726882585.06691: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19110 1726882585.06694: getting variables 19110 1726882585.06696: in VariableManager get_vars() 19110 1726882585.06720: Calling all_inventory to load vars for managed_node1 19110 1726882585.06723: Calling groups_inventory to load vars for managed_node1 19110 1726882585.06726: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882585.06735: Calling all_plugins_play to load vars for managed_node1 19110 1726882585.06738: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882585.06740: Calling groups_plugins_play to load vars for managed_node1 19110 1726882585.08246: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882585.10177: done with get_vars() 19110 1726882585.10199: done getting variables 19110 1726882585.10276: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 19110 1726882585.10404: variable 'profile' from source: include params 19110 1726882585.10408: variable 'interface' from source: set_fact 19110 1726882585.10481: variable 'interface' from source: set_fact TASK [Get the ansible_managed comment in ifcfg-lsr27] ************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Friday 20 September 2024 21:36:25 -0400 (0:00:00.071) 0:00:41.962 ****** 19110 1726882585.10512: entering _queue_task() for managed_node1/command 19110 1726882585.10833: worker is 1 (out of 1 available) 19110 1726882585.10846: exiting _queue_task() for managed_node1/command 19110 1726882585.10861: done queuing things up, now waiting for results queue to drain 19110 1726882585.10863: waiting for pending results... 
19110 1726882585.11166: running TaskExecutor() for managed_node1/TASK: Get the ansible_managed comment in ifcfg-lsr27 19110 1726882585.11302: in run() - task 0e448fcc-3ce9-5372-c19a-000000000508 19110 1726882585.11328: variable 'ansible_search_path' from source: unknown 19110 1726882585.11340: variable 'ansible_search_path' from source: unknown 19110 1726882585.11386: calling self._execute() 19110 1726882585.11484: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882585.11496: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882585.11511: variable 'omit' from source: magic vars 19110 1726882585.11911: variable 'ansible_distribution_major_version' from source: facts 19110 1726882585.11929: Evaluated conditional (ansible_distribution_major_version != '6'): True 19110 1726882585.12067: variable 'profile_stat' from source: set_fact 19110 1726882585.12095: Evaluated conditional (profile_stat.stat.exists): False 19110 1726882585.12102: when evaluation is False, skipping this task 19110 1726882585.12108: _execute() done 19110 1726882585.12114: dumping result to json 19110 1726882585.12121: done dumping result, returning 19110 1726882585.12129: done running TaskExecutor() for managed_node1/TASK: Get the ansible_managed comment in ifcfg-lsr27 [0e448fcc-3ce9-5372-c19a-000000000508] 19110 1726882585.12139: sending task result for task 0e448fcc-3ce9-5372-c19a-000000000508 skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 19110 1726882585.12288: no more pending results, returning what we have 19110 1726882585.12293: results queue empty 19110 1726882585.12294: checking for any_errors_fatal 19110 1726882585.12301: done checking for any_errors_fatal 19110 1726882585.12302: checking for max_fail_percentage 19110 1726882585.12304: done checking for max_fail_percentage 19110 1726882585.12305: checking to see if all hosts have 
failed and the running result is not ok 19110 1726882585.12306: done checking to see if all hosts have failed 19110 1726882585.12306: getting the remaining hosts for this loop 19110 1726882585.12308: done getting the remaining hosts for this loop 19110 1726882585.12312: getting the next task for host managed_node1 19110 1726882585.12319: done getting next task for host managed_node1 19110 1726882585.12322: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 19110 1726882585.12327: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19110 1726882585.12331: getting variables 19110 1726882585.12333: in VariableManager get_vars() 19110 1726882585.12371: Calling all_inventory to load vars for managed_node1 19110 1726882585.12374: Calling groups_inventory to load vars for managed_node1 19110 1726882585.12379: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882585.12393: Calling all_plugins_play to load vars for managed_node1 19110 1726882585.12397: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882585.12400: Calling groups_plugins_play to load vars for managed_node1 19110 1726882585.13433: done sending task result for task 0e448fcc-3ce9-5372-c19a-000000000508 19110 1726882585.13437: WORKER PROCESS EXITING 19110 1726882585.14407: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882585.16295: done with get_vars() 19110 1726882585.16318: done getting variables 19110 1726882585.16385: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 19110 1726882585.16506: variable 'profile' from source: include params 19110 1726882585.16510: variable 'interface' from source: set_fact 19110 1726882585.16568: variable 'interface' from source: set_fact TASK [Verify the ansible_managed comment in ifcfg-lsr27] *********************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Friday 20 September 2024 21:36:25 -0400 (0:00:00.060) 0:00:42.023 ****** 19110 1726882585.16598: entering _queue_task() for managed_node1/set_fact 19110 1726882585.16891: worker is 1 (out of 1 available) 19110 1726882585.16904: exiting _queue_task() for managed_node1/set_fact 19110 
1726882585.16916: done queuing things up, now waiting for results queue to drain 19110 1726882585.16918: waiting for pending results... 19110 1726882585.17202: running TaskExecutor() for managed_node1/TASK: Verify the ansible_managed comment in ifcfg-lsr27 19110 1726882585.17324: in run() - task 0e448fcc-3ce9-5372-c19a-000000000509 19110 1726882585.17342: variable 'ansible_search_path' from source: unknown 19110 1726882585.17348: variable 'ansible_search_path' from source: unknown 19110 1726882585.17397: calling self._execute() 19110 1726882585.17496: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882585.17507: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882585.17520: variable 'omit' from source: magic vars 19110 1726882585.17898: variable 'ansible_distribution_major_version' from source: facts 19110 1726882585.17923: Evaluated conditional (ansible_distribution_major_version != '6'): True 19110 1726882585.18060: variable 'profile_stat' from source: set_fact 19110 1726882585.18081: Evaluated conditional (profile_stat.stat.exists): False 19110 1726882585.18089: when evaluation is False, skipping this task 19110 1726882585.18096: _execute() done 19110 1726882585.18102: dumping result to json 19110 1726882585.18110: done dumping result, returning 19110 1726882585.18123: done running TaskExecutor() for managed_node1/TASK: Verify the ansible_managed comment in ifcfg-lsr27 [0e448fcc-3ce9-5372-c19a-000000000509] 19110 1726882585.18141: sending task result for task 0e448fcc-3ce9-5372-c19a-000000000509 skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 19110 1726882585.18285: no more pending results, returning what we have 19110 1726882585.18290: results queue empty 19110 1726882585.18291: checking for any_errors_fatal 19110 1726882585.18297: done checking for any_errors_fatal 19110 1726882585.18298: checking 
for max_fail_percentage 19110 1726882585.18299: done checking for max_fail_percentage 19110 1726882585.18300: checking to see if all hosts have failed and the running result is not ok 19110 1726882585.18301: done checking to see if all hosts have failed 19110 1726882585.18302: getting the remaining hosts for this loop 19110 1726882585.18304: done getting the remaining hosts for this loop 19110 1726882585.18307: getting the next task for host managed_node1 19110 1726882585.18314: done getting next task for host managed_node1 19110 1726882585.18317: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 19110 1726882585.18322: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19110 1726882585.18327: getting variables 19110 1726882585.18329: in VariableManager get_vars() 19110 1726882585.18359: Calling all_inventory to load vars for managed_node1 19110 1726882585.18363: Calling groups_inventory to load vars for managed_node1 19110 1726882585.18374: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882585.18388: Calling all_plugins_play to load vars for managed_node1 19110 1726882585.18391: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882585.18395: Calling groups_plugins_play to load vars for managed_node1 19110 1726882585.19429: done sending task result for task 0e448fcc-3ce9-5372-c19a-000000000509 19110 1726882585.19432: WORKER PROCESS EXITING 19110 1726882585.23828: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882585.25527: done with get_vars() 19110 1726882585.25549: done getting variables 19110 1726882585.25599: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 19110 1726882585.25695: variable 'profile' from source: include params 19110 1726882585.25700: variable 'interface' from source: set_fact 19110 1726882585.25743: variable 'interface' from source: set_fact TASK [Get the fingerprint comment in ifcfg-lsr27] ****************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Friday 20 September 2024 21:36:25 -0400 (0:00:00.091) 0:00:42.114 ****** 19110 1726882585.25767: entering _queue_task() for managed_node1/command 19110 1726882585.26023: worker is 1 (out of 1 available) 19110 1726882585.26036: exiting _queue_task() for managed_node1/command 19110 
1726882585.26046: done queuing things up, now waiting for results queue to drain 19110 1726882585.26048: waiting for pending results... 19110 1726882585.26276: running TaskExecutor() for managed_node1/TASK: Get the fingerprint comment in ifcfg-lsr27 19110 1726882585.26411: in run() - task 0e448fcc-3ce9-5372-c19a-00000000050a 19110 1726882585.26431: variable 'ansible_search_path' from source: unknown 19110 1726882585.26439: variable 'ansible_search_path' from source: unknown 19110 1726882585.26483: calling self._execute() 19110 1726882585.26587: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882585.26600: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882585.26616: variable 'omit' from source: magic vars 19110 1726882585.26990: variable 'ansible_distribution_major_version' from source: facts 19110 1726882585.27008: Evaluated conditional (ansible_distribution_major_version != '6'): True 19110 1726882585.27132: variable 'profile_stat' from source: set_fact 19110 1726882585.27152: Evaluated conditional (profile_stat.stat.exists): False 19110 1726882585.27166: when evaluation is False, skipping this task 19110 1726882585.27177: _execute() done 19110 1726882585.27187: dumping result to json 19110 1726882585.27196: done dumping result, returning 19110 1726882585.27206: done running TaskExecutor() for managed_node1/TASK: Get the fingerprint comment in ifcfg-lsr27 [0e448fcc-3ce9-5372-c19a-00000000050a] 19110 1726882585.27218: sending task result for task 0e448fcc-3ce9-5372-c19a-00000000050a skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 19110 1726882585.27369: no more pending results, returning what we have 19110 1726882585.27374: results queue empty 19110 1726882585.27375: checking for any_errors_fatal 19110 1726882585.27389: done checking for any_errors_fatal 19110 1726882585.27390: checking for 
max_fail_percentage 19110 1726882585.27392: done checking for max_fail_percentage 19110 1726882585.27393: checking to see if all hosts have failed and the running result is not ok 19110 1726882585.27393: done checking to see if all hosts have failed 19110 1726882585.27394: getting the remaining hosts for this loop 19110 1726882585.27396: done getting the remaining hosts for this loop 19110 1726882585.27404: getting the next task for host managed_node1 19110 1726882585.27411: done getting next task for host managed_node1 19110 1726882585.27419: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 19110 1726882585.27425: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19110 1726882585.27430: getting variables 19110 1726882585.27432: in VariableManager get_vars() 19110 1726882585.27463: Calling all_inventory to load vars for managed_node1 19110 1726882585.27468: Calling groups_inventory to load vars for managed_node1 19110 1726882585.27471: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882585.27484: Calling all_plugins_play to load vars for managed_node1 19110 1726882585.27500: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882585.27506: Calling groups_plugins_play to load vars for managed_node1 19110 1726882585.28103: done sending task result for task 0e448fcc-3ce9-5372-c19a-00000000050a 19110 1726882585.28107: WORKER PROCESS EXITING 19110 1726882585.28411: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882585.29406: done with get_vars() 19110 1726882585.29426: done getting variables 19110 1726882585.29484: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 19110 1726882585.29583: variable 'profile' from source: include params 19110 1726882585.29587: variable 'interface' from source: set_fact 19110 1726882585.29641: variable 'interface' from source: set_fact TASK [Verify the fingerprint comment in ifcfg-lsr27] *************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Friday 20 September 2024 21:36:25 -0400 (0:00:00.039) 0:00:42.154 ****** 19110 1726882585.29674: entering _queue_task() for managed_node1/set_fact 19110 1726882585.29934: worker is 1 (out of 1 available) 19110 1726882585.29946: exiting _queue_task() for managed_node1/set_fact 19110 
1726882585.29961: done queuing things up, now waiting for results queue to drain 19110 1726882585.29962: waiting for pending results... 19110 1726882585.30233: running TaskExecutor() for managed_node1/TASK: Verify the fingerprint comment in ifcfg-lsr27 19110 1726882585.30357: in run() - task 0e448fcc-3ce9-5372-c19a-00000000050b 19110 1726882585.30375: variable 'ansible_search_path' from source: unknown 19110 1726882585.30378: variable 'ansible_search_path' from source: unknown 19110 1726882585.30414: calling self._execute() 19110 1726882585.30501: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882585.30505: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882585.30519: variable 'omit' from source: magic vars 19110 1726882585.30843: variable 'ansible_distribution_major_version' from source: facts 19110 1726882585.30853: Evaluated conditional (ansible_distribution_major_version != '6'): True 19110 1726882585.30936: variable 'profile_stat' from source: set_fact 19110 1726882585.30948: Evaluated conditional (profile_stat.stat.exists): False 19110 1726882585.30951: when evaluation is False, skipping this task 19110 1726882585.30954: _execute() done 19110 1726882585.30956: dumping result to json 19110 1726882585.30959: done dumping result, returning 19110 1726882585.30968: done running TaskExecutor() for managed_node1/TASK: Verify the fingerprint comment in ifcfg-lsr27 [0e448fcc-3ce9-5372-c19a-00000000050b] 19110 1726882585.30983: sending task result for task 0e448fcc-3ce9-5372-c19a-00000000050b 19110 1726882585.31057: done sending task result for task 0e448fcc-3ce9-5372-c19a-00000000050b 19110 1726882585.31060: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 19110 1726882585.31106: no more pending results, returning what we have 19110 1726882585.31110: results queue empty 19110 
1726882585.31110: checking for any_errors_fatal 19110 1726882585.31117: done checking for any_errors_fatal 19110 1726882585.31118: checking for max_fail_percentage 19110 1726882585.31119: done checking for max_fail_percentage 19110 1726882585.31120: checking to see if all hosts have failed and the running result is not ok 19110 1726882585.31121: done checking to see if all hosts have failed 19110 1726882585.31122: getting the remaining hosts for this loop 19110 1726882585.31123: done getting the remaining hosts for this loop 19110 1726882585.31126: getting the next task for host managed_node1 19110 1726882585.31132: done getting next task for host managed_node1 19110 1726882585.31135: ^ task is: TASK: Assert that the profile is absent - '{{ profile }}' 19110 1726882585.31138: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19110 1726882585.31141: getting variables 19110 1726882585.31143: in VariableManager get_vars() 19110 1726882585.31167: Calling all_inventory to load vars for managed_node1 19110 1726882585.31169: Calling groups_inventory to load vars for managed_node1 19110 1726882585.31172: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882585.31182: Calling all_plugins_play to load vars for managed_node1 19110 1726882585.31184: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882585.31187: Calling groups_plugins_play to load vars for managed_node1 19110 1726882585.32081: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882585.33315: done with get_vars() 19110 1726882585.33336: done getting variables 19110 1726882585.33412: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 19110 1726882585.33539: variable 'profile' from source: include params 19110 1726882585.33542: variable 'interface' from source: set_fact 19110 1726882585.33585: variable 'interface' from source: set_fact TASK [Assert that the profile is absent - 'lsr27'] ***************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml:5 Friday 20 September 2024 21:36:25 -0400 (0:00:00.039) 0:00:42.193 ****** 19110 1726882585.33621: entering _queue_task() for managed_node1/assert 19110 1726882585.33871: worker is 1 (out of 1 available) 19110 1726882585.33885: exiting _queue_task() for managed_node1/assert 19110 1726882585.33896: done queuing things up, now waiting for results queue to drain 19110 1726882585.33898: waiting for pending results... 
19110 1726882585.34089: running TaskExecutor() for managed_node1/TASK: Assert that the profile is absent - 'lsr27' 19110 1726882585.34159: in run() - task 0e448fcc-3ce9-5372-c19a-0000000004f6 19110 1726882585.34169: variable 'ansible_search_path' from source: unknown 19110 1726882585.34172: variable 'ansible_search_path' from source: unknown 19110 1726882585.34229: calling self._execute() 19110 1726882585.34342: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882585.34354: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882585.34366: variable 'omit' from source: magic vars 19110 1726882585.34826: variable 'ansible_distribution_major_version' from source: facts 19110 1726882585.34850: Evaluated conditional (ansible_distribution_major_version != '6'): True 19110 1726882585.34857: variable 'omit' from source: magic vars 19110 1726882585.34900: variable 'omit' from source: magic vars 19110 1726882585.35015: variable 'profile' from source: include params 19110 1726882585.35020: variable 'interface' from source: set_fact 19110 1726882585.35101: variable 'interface' from source: set_fact 19110 1726882585.35129: variable 'omit' from source: magic vars 19110 1726882585.35186: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19110 1726882585.35235: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19110 1726882585.35261: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19110 1726882585.35295: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19110 1726882585.35307: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19110 1726882585.35341: variable 'inventory_hostname' from source: host vars for 
'managed_node1' 19110 1726882585.35344: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882585.35346: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882585.35474: Set connection var ansible_timeout to 10 19110 1726882585.35486: Set connection var ansible_module_compression to ZIP_DEFLATED 19110 1726882585.35495: Set connection var ansible_shell_executable to /bin/sh 19110 1726882585.35498: Set connection var ansible_shell_type to sh 19110 1726882585.35501: Set connection var ansible_connection to ssh 19110 1726882585.35505: Set connection var ansible_pipelining to False 19110 1726882585.35523: variable 'ansible_shell_executable' from source: unknown 19110 1726882585.35525: variable 'ansible_connection' from source: unknown 19110 1726882585.35528: variable 'ansible_module_compression' from source: unknown 19110 1726882585.35530: variable 'ansible_shell_type' from source: unknown 19110 1726882585.35532: variable 'ansible_shell_executable' from source: unknown 19110 1726882585.35534: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882585.35538: variable 'ansible_pipelining' from source: unknown 19110 1726882585.35541: variable 'ansible_timeout' from source: unknown 19110 1726882585.35545: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882585.35725: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 19110 1726882585.35749: variable 'omit' from source: magic vars 19110 1726882585.35760: starting attempt loop 19110 1726882585.35772: running the handler 19110 1726882585.35918: variable 'lsr_net_profile_exists' from source: set_fact 19110 1726882585.35929: Evaluated conditional (not 
lsr_net_profile_exists): True 19110 1726882585.35939: handler run complete 19110 1726882585.35974: attempt loop complete, returning result 19110 1726882585.35982: _execute() done 19110 1726882585.35998: dumping result to json 19110 1726882585.36010: done dumping result, returning 19110 1726882585.36028: done running TaskExecutor() for managed_node1/TASK: Assert that the profile is absent - 'lsr27' [0e448fcc-3ce9-5372-c19a-0000000004f6] 19110 1726882585.36039: sending task result for task 0e448fcc-3ce9-5372-c19a-0000000004f6 19110 1726882585.36159: done sending task result for task 0e448fcc-3ce9-5372-c19a-0000000004f6 ok: [managed_node1] => { "changed": false } MSG: All assertions passed 19110 1726882585.36232: no more pending results, returning what we have 19110 1726882585.36238: results queue empty 19110 1726882585.36250: checking for any_errors_fatal 19110 1726882585.36259: done checking for any_errors_fatal 19110 1726882585.36260: checking for max_fail_percentage 19110 1726882585.36262: done checking for max_fail_percentage 19110 1726882585.36265: checking to see if all hosts have failed and the running result is not ok 19110 1726882585.36266: done checking to see if all hosts have failed 19110 1726882585.36266: getting the remaining hosts for this loop 19110 1726882585.36268: done getting the remaining hosts for this loop 19110 1726882585.36271: getting the next task for host managed_node1 19110 1726882585.36281: done getting next task for host managed_node1 19110 1726882585.36291: ^ task is: TASK: Include the task 'assert_device_absent.yml' 19110 1726882585.36293: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19110 1726882585.36299: getting variables 19110 1726882585.36300: in VariableManager get_vars() 19110 1726882585.36345: Calling all_inventory to load vars for managed_node1 19110 1726882585.36350: Calling groups_inventory to load vars for managed_node1 19110 1726882585.36360: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882585.36370: Calling all_plugins_play to load vars for managed_node1 19110 1726882585.36373: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882585.36378: Calling groups_plugins_play to load vars for managed_node1 19110 1726882585.36406: WORKER PROCESS EXITING 19110 1726882585.37527: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882585.38502: done with get_vars() 19110 1726882585.38516: done getting variables TASK [Include the task 'assert_device_absent.yml'] ***************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:75 Friday 20 September 2024 21:36:25 -0400 (0:00:00.049) 0:00:42.243 ****** 19110 1726882585.38578: entering _queue_task() for managed_node1/include_tasks 19110 1726882585.38761: worker is 1 (out of 1 available) 19110 1726882585.38777: exiting _queue_task() for managed_node1/include_tasks 19110 1726882585.38788: done queuing things up, now waiting for results queue to drain 19110 1726882585.38790: waiting for pending results... 
19110 1726882585.38951: running TaskExecutor() for managed_node1/TASK: Include the task 'assert_device_absent.yml' 19110 1726882585.39024: in run() - task 0e448fcc-3ce9-5372-c19a-000000000075 19110 1726882585.39035: variable 'ansible_search_path' from source: unknown 19110 1726882585.39068: calling self._execute() 19110 1726882585.39137: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882585.39142: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882585.39150: variable 'omit' from source: magic vars 19110 1726882585.39423: variable 'ansible_distribution_major_version' from source: facts 19110 1726882585.39433: Evaluated conditional (ansible_distribution_major_version != '6'): True 19110 1726882585.39442: _execute() done 19110 1726882585.39445: dumping result to json 19110 1726882585.39449: done dumping result, returning 19110 1726882585.39452: done running TaskExecutor() for managed_node1/TASK: Include the task 'assert_device_absent.yml' [0e448fcc-3ce9-5372-c19a-000000000075] 19110 1726882585.39466: sending task result for task 0e448fcc-3ce9-5372-c19a-000000000075 19110 1726882585.39547: done sending task result for task 0e448fcc-3ce9-5372-c19a-000000000075 19110 1726882585.39550: WORKER PROCESS EXITING 19110 1726882585.39588: no more pending results, returning what we have 19110 1726882585.39593: in VariableManager get_vars() 19110 1726882585.39711: Calling all_inventory to load vars for managed_node1 19110 1726882585.39719: Calling groups_inventory to load vars for managed_node1 19110 1726882585.39724: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882585.39733: Calling all_plugins_play to load vars for managed_node1 19110 1726882585.39736: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882585.39739: Calling groups_plugins_play to load vars for managed_node1 19110 1726882585.40976: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882585.42285: done with get_vars() 19110 1726882585.42297: variable 'ansible_search_path' from source: unknown 19110 1726882585.42306: we have included files to process 19110 1726882585.42307: generating all_blocks data 19110 1726882585.42308: done generating all_blocks data 19110 1726882585.42312: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 19110 1726882585.42312: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 19110 1726882585.42314: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 19110 1726882585.42419: in VariableManager get_vars() 19110 1726882585.42429: done with get_vars() 19110 1726882585.42503: done processing included file 19110 1726882585.42505: iterating over new_blocks loaded from include file 19110 1726882585.42506: in VariableManager get_vars() 19110 1726882585.42513: done with get_vars() 19110 1726882585.42513: filtering new block on tags 19110 1726882585.42524: done filtering new block on tags 19110 1726882585.42526: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml for managed_node1 19110 1726882585.42529: extending task lists for all hosts with included blocks 19110 1726882585.42616: done extending task lists 19110 1726882585.42616: done processing included files 19110 1726882585.42617: results queue empty 19110 1726882585.42617: checking for any_errors_fatal 19110 1726882585.42620: done checking for any_errors_fatal 19110 1726882585.42620: checking for max_fail_percentage 19110 1726882585.42621: done 
checking for max_fail_percentage 19110 1726882585.42622: checking to see if all hosts have failed and the running result is not ok 19110 1726882585.42622: done checking to see if all hosts have failed 19110 1726882585.42622: getting the remaining hosts for this loop 19110 1726882585.42623: done getting the remaining hosts for this loop 19110 1726882585.42625: getting the next task for host managed_node1 19110 1726882585.42627: done getting next task for host managed_node1 19110 1726882585.42628: ^ task is: TASK: Include the task 'get_interface_stat.yml' 19110 1726882585.42630: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19110 1726882585.42631: getting variables 19110 1726882585.42632: in VariableManager get_vars() 19110 1726882585.42637: Calling all_inventory to load vars for managed_node1 19110 1726882585.42639: Calling groups_inventory to load vars for managed_node1 19110 1726882585.42640: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882585.42644: Calling all_plugins_play to load vars for managed_node1 19110 1726882585.42645: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882585.42647: Calling groups_plugins_play to load vars for managed_node1 19110 1726882585.43524: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882585.44571: done with get_vars() 19110 1726882585.44595: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:3 Friday 20 September 2024 21:36:25 -0400 (0:00:00.060) 0:00:42.304 ****** 19110 1726882585.44674: entering _queue_task() for managed_node1/include_tasks 19110 1726882585.44910: worker is 1 (out of 1 available) 19110 1726882585.44923: exiting _queue_task() for managed_node1/include_tasks 19110 1726882585.44936: done queuing things up, now waiting for results queue to drain 19110 1726882585.44937: waiting for pending results... 
19110 1726882585.45145: running TaskExecutor() for managed_node1/TASK: Include the task 'get_interface_stat.yml' 19110 1726882585.45216: in run() - task 0e448fcc-3ce9-5372-c19a-00000000053c 19110 1726882585.45227: variable 'ansible_search_path' from source: unknown 19110 1726882585.45230: variable 'ansible_search_path' from source: unknown 19110 1726882585.45256: calling self._execute() 19110 1726882585.45356: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882585.45399: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882585.45404: variable 'omit' from source: magic vars 19110 1726882585.45738: variable 'ansible_distribution_major_version' from source: facts 19110 1726882585.45749: Evaluated conditional (ansible_distribution_major_version != '6'): True 19110 1726882585.45752: _execute() done 19110 1726882585.45755: dumping result to json 19110 1726882585.45774: done dumping result, returning 19110 1726882585.45778: done running TaskExecutor() for managed_node1/TASK: Include the task 'get_interface_stat.yml' [0e448fcc-3ce9-5372-c19a-00000000053c] 19110 1726882585.45780: sending task result for task 0e448fcc-3ce9-5372-c19a-00000000053c 19110 1726882585.45886: done sending task result for task 0e448fcc-3ce9-5372-c19a-00000000053c 19110 1726882585.45890: WORKER PROCESS EXITING 19110 1726882585.45974: no more pending results, returning what we have 19110 1726882585.45979: in VariableManager get_vars() 19110 1726882585.46012: Calling all_inventory to load vars for managed_node1 19110 1726882585.46015: Calling groups_inventory to load vars for managed_node1 19110 1726882585.46018: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882585.46032: Calling all_plugins_play to load vars for managed_node1 19110 1726882585.46036: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882585.46043: Calling groups_plugins_play to load vars for managed_node1 19110 
1726882585.48046: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882585.49465: done with get_vars() 19110 1726882585.49481: variable 'ansible_search_path' from source: unknown 19110 1726882585.49482: variable 'ansible_search_path' from source: unknown 19110 1726882585.49506: we have included files to process 19110 1726882585.49507: generating all_blocks data 19110 1726882585.49508: done generating all_blocks data 19110 1726882585.49509: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 19110 1726882585.49510: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 19110 1726882585.49511: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 19110 1726882585.49646: done processing included file 19110 1726882585.49648: iterating over new_blocks loaded from include file 19110 1726882585.49649: in VariableManager get_vars() 19110 1726882585.49659: done with get_vars() 19110 1726882585.49660: filtering new block on tags 19110 1726882585.49672: done filtering new block on tags 19110 1726882585.49674: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node1 19110 1726882585.49678: extending task lists for all hosts with included blocks 19110 1726882585.49736: done extending task lists 19110 1726882585.49737: done processing included files 19110 1726882585.49737: results queue empty 19110 1726882585.49738: checking for any_errors_fatal 19110 1726882585.49740: done checking for any_errors_fatal 19110 1726882585.49740: checking for max_fail_percentage 19110 1726882585.49741: done checking for 
max_fail_percentage 19110 1726882585.49742: checking to see if all hosts have failed and the running result is not ok 19110 1726882585.49742: done checking to see if all hosts have failed 19110 1726882585.49743: getting the remaining hosts for this loop 19110 1726882585.49743: done getting the remaining hosts for this loop 19110 1726882585.49745: getting the next task for host managed_node1 19110 1726882585.49747: done getting next task for host managed_node1 19110 1726882585.49749: ^ task is: TASK: Get stat for interface {{ interface }} 19110 1726882585.49751: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19110 1726882585.49752: getting variables 19110 1726882585.49753: in VariableManager get_vars() 19110 1726882585.49761: Calling all_inventory to load vars for managed_node1 19110 1726882585.49762: Calling groups_inventory to load vars for managed_node1 19110 1726882585.49765: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882585.49769: Calling all_plugins_play to load vars for managed_node1 19110 1726882585.49770: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882585.49772: Calling groups_plugins_play to load vars for managed_node1 19110 1726882585.50727: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882585.51879: done with get_vars() 19110 1726882585.51900: done getting variables 19110 1726882585.52014: variable 'interface' from source: set_fact TASK [Get stat for interface lsr27] ******************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 21:36:25 -0400 (0:00:00.073) 0:00:42.377 ****** 19110 1726882585.52045: entering _queue_task() for managed_node1/stat 19110 1726882585.52309: worker is 1 (out of 1 available) 19110 1726882585.52321: exiting _queue_task() for managed_node1/stat 19110 1726882585.52339: done queuing things up, now waiting for results queue to drain 19110 1726882585.52341: waiting for pending results... 
19110 1726882585.52607: running TaskExecutor() for managed_node1/TASK: Get stat for interface lsr27 19110 1726882585.52686: in run() - task 0e448fcc-3ce9-5372-c19a-000000000554 19110 1726882585.52697: variable 'ansible_search_path' from source: unknown 19110 1726882585.52700: variable 'ansible_search_path' from source: unknown 19110 1726882585.52739: calling self._execute() 19110 1726882585.52807: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882585.52811: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882585.52821: variable 'omit' from source: magic vars 19110 1726882585.53167: variable 'ansible_distribution_major_version' from source: facts 19110 1726882585.53178: Evaluated conditional (ansible_distribution_major_version != '6'): True 19110 1726882585.53183: variable 'omit' from source: magic vars 19110 1726882585.53216: variable 'omit' from source: magic vars 19110 1726882585.53296: variable 'interface' from source: set_fact 19110 1726882585.53308: variable 'omit' from source: magic vars 19110 1726882585.53342: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19110 1726882585.53403: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19110 1726882585.53406: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19110 1726882585.53421: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19110 1726882585.53431: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19110 1726882585.53454: variable 'inventory_hostname' from source: host vars for 'managed_node1' 19110 1726882585.53460: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882585.53462: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882585.53541: Set connection var ansible_timeout to 10 19110 1726882585.53551: Set connection var ansible_module_compression to ZIP_DEFLATED 19110 1726882585.53558: Set connection var ansible_shell_executable to /bin/sh 19110 1726882585.53560: Set connection var ansible_shell_type to sh 19110 1726882585.53563: Set connection var ansible_connection to ssh 19110 1726882585.53566: Set connection var ansible_pipelining to False 19110 1726882585.53601: variable 'ansible_shell_executable' from source: unknown 19110 1726882585.53605: variable 'ansible_connection' from source: unknown 19110 1726882585.53607: variable 'ansible_module_compression' from source: unknown 19110 1726882585.53609: variable 'ansible_shell_type' from source: unknown 19110 1726882585.53622: variable 'ansible_shell_executable' from source: unknown 19110 1726882585.53626: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882585.53628: variable 'ansible_pipelining' from source: unknown 19110 1726882585.53630: variable 'ansible_timeout' from source: unknown 19110 1726882585.53632: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882585.53777: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 19110 1726882585.53785: variable 'omit' from source: magic vars 19110 1726882585.53791: starting attempt loop 19110 1726882585.53794: running the handler 19110 1726882585.53805: _low_level_execute_command(): starting 19110 1726882585.53811: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 19110 1726882585.54353: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882585.54374: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882585.54388: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882585.54403: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882585.54453: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882585.54461: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882585.54478: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882585.54591: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882585.56276: stdout chunk (state=3): >>>/root <<< 19110 1726882585.56370: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882585.56422: stderr chunk (state=3): >>><<< 19110 1726882585.56425: stdout chunk (state=3): >>><<< 19110 1726882585.56445: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 1726882585.56455: _low_level_execute_command(): starting 19110 1726882585.56469: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882585.5644426-21003-73977811454841 `" && echo ansible-tmp-1726882585.5644426-21003-73977811454841="` echo /root/.ansible/tmp/ansible-tmp-1726882585.5644426-21003-73977811454841 `" ) && sleep 0' 19110 1726882585.57078: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882585.57100: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882585.57201: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 19110 1726882585.57240: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882585.57246: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882585.57333: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882585.59219: stdout chunk (state=3): >>>ansible-tmp-1726882585.5644426-21003-73977811454841=/root/.ansible/tmp/ansible-tmp-1726882585.5644426-21003-73977811454841 <<< 19110 1726882585.59317: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882585.59367: stderr chunk (state=3): >>><<< 19110 1726882585.59370: stdout chunk (state=3): >>><<< 19110 1726882585.59383: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882585.5644426-21003-73977811454841=/root/.ansible/tmp/ansible-tmp-1726882585.5644426-21003-73977811454841 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 1726882585.59418: variable 'ansible_module_compression' from source: unknown 19110 1726882585.59470: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-191108pnkimox/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 19110 1726882585.59503: variable 'ansible_facts' from source: unknown 19110 1726882585.59558: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882585.5644426-21003-73977811454841/AnsiballZ_stat.py 19110 1726882585.59659: Sending initial data 19110 1726882585.59662: Sent initial data (152 bytes) 19110 1726882585.60326: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882585.60330: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882585.60369: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found <<< 19110 1726882585.60373: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882585.60375: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found <<< 19110 1726882585.60377: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882585.60421: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882585.60425: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882585.60523: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882585.62255: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 19110 1726882585.62345: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 19110 1726882585.62433: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-191108pnkimox/tmph2vo7yuh /root/.ansible/tmp/ansible-tmp-1726882585.5644426-21003-73977811454841/AnsiballZ_stat.py <<< 19110 1726882585.62526: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 19110 1726882585.63511: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882585.63601: stderr chunk (state=3): >>><<< 19110 1726882585.63604: stdout chunk (state=3): >>><<< 19110 1726882585.63618: done 
transferring module to remote 19110 1726882585.63627: _low_level_execute_command(): starting 19110 1726882585.63631: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882585.5644426-21003-73977811454841/ /root/.ansible/tmp/ansible-tmp-1726882585.5644426-21003-73977811454841/AnsiballZ_stat.py && sleep 0' 19110 1726882585.64045: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882585.64051: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882585.64084: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882585.64096: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882585.64151: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882585.64160: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882585.64272: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882585.66112: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882585.66153: stderr chunk (state=3): >>><<< 19110 
1726882585.66157: stdout chunk (state=3): >>><<< 19110 1726882585.66183: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 1726882585.66190: _low_level_execute_command(): starting 19110 1726882585.66192: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882585.5644426-21003-73977811454841/AnsiballZ_stat.py && sleep 0' 19110 1726882585.66627: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882585.66632: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882585.66676: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882585.66701: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882585.66746: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882585.66757: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882585.66767: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882585.66887: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882585.79904: stdout chunk (state=3): >>> <<< 19110 1726882585.79908: stdout chunk (state=3): >>>{"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/lsr27", "follow": false, "checksum_algorithm": "sha1"}}} <<< 19110 1726882585.80888: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
<<< 19110 1726882585.80935: stderr chunk (state=3): >>><<< 19110 1726882585.80938: stdout chunk (state=3): >>><<< 19110 1726882585.80956: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/lsr27", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
19110 1726882585.80986: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/lsr27', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882585.5644426-21003-73977811454841/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 19110 1726882585.80993: _low_level_execute_command(): starting 19110 1726882585.80997: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882585.5644426-21003-73977811454841/ > /dev/null 2>&1 && sleep 0' 19110 1726882585.81431: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882585.81437: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882585.81482: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882585.81485: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882585.81546: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882585.81553: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882585.81557: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882585.81645: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882585.83426: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882585.83471: stderr chunk (state=3): >>><<< 19110 1726882585.83476: stdout chunk (state=3): >>><<< 19110 1726882585.83489: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 
2 debug2: Received exit status from master 0 19110 1726882585.83494: handler run complete 19110 1726882585.83511: attempt loop complete, returning result 19110 1726882585.83514: _execute() done 19110 1726882585.83516: dumping result to json 19110 1726882585.83523: done dumping result, returning 19110 1726882585.83526: done running TaskExecutor() for managed_node1/TASK: Get stat for interface lsr27 [0e448fcc-3ce9-5372-c19a-000000000554] 19110 1726882585.83532: sending task result for task 0e448fcc-3ce9-5372-c19a-000000000554 19110 1726882585.83709: done sending task result for task 0e448fcc-3ce9-5372-c19a-000000000554 19110 1726882585.83712: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "stat": { "exists": false } } 19110 1726882585.83848: no more pending results, returning what we have 19110 1726882585.83871: results queue empty 19110 1726882585.83872: checking for any_errors_fatal 19110 1726882585.83874: done checking for any_errors_fatal 19110 1726882585.83886: checking for max_fail_percentage 19110 1726882585.83888: done checking for max_fail_percentage 19110 1726882585.83889: checking to see if all hosts have failed and the running result is not ok 19110 1726882585.83890: done checking to see if all hosts have failed 19110 1726882585.83891: getting the remaining hosts for this loop 19110 1726882585.83893: done getting the remaining hosts for this loop 19110 1726882585.83896: getting the next task for host managed_node1 19110 1726882585.83915: done getting next task for host managed_node1 19110 1726882585.83948: ^ task is: TASK: Assert that the interface is absent - '{{ interface }}' 19110 1726882585.83991: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19110 1726882585.84019: getting variables 19110 1726882585.84028: in VariableManager get_vars() 19110 1726882585.84121: Calling all_inventory to load vars for managed_node1 19110 1726882585.84131: Calling groups_inventory to load vars for managed_node1 19110 1726882585.84146: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882585.84170: Calling all_plugins_play to load vars for managed_node1 19110 1726882585.84173: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882585.84177: Calling groups_plugins_play to load vars for managed_node1 19110 1726882585.85512: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882585.87302: done with get_vars() 19110 1726882585.87324: done getting variables 19110 1726882585.87386: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 19110 1726882585.87502: variable 'interface' from source: set_fact TASK [Assert that the interface is absent - 'lsr27'] *************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:5 Friday 20 September 2024 21:36:25 -0400 (0:00:00.354) 0:00:42.732 ****** 19110 1726882585.87532: entering _queue_task() for managed_node1/assert 19110 1726882585.87844: worker is 1 (out 
of 1 available) 19110 1726882585.87859: exiting _queue_task() for managed_node1/assert 19110 1726882585.87873: done queuing things up, now waiting for results queue to drain 19110 1726882585.87875: waiting for pending results... 19110 1726882585.88146: running TaskExecutor() for managed_node1/TASK: Assert that the interface is absent - 'lsr27' 19110 1726882585.88258: in run() - task 0e448fcc-3ce9-5372-c19a-00000000053d 19110 1726882585.88282: variable 'ansible_search_path' from source: unknown 19110 1726882585.88290: variable 'ansible_search_path' from source: unknown 19110 1726882585.88333: calling self._execute() 19110 1726882585.88425: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882585.88435: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882585.88448: variable 'omit' from source: magic vars 19110 1726882585.88820: variable 'ansible_distribution_major_version' from source: facts 19110 1726882585.88838: Evaluated conditional (ansible_distribution_major_version != '6'): True 19110 1726882585.88848: variable 'omit' from source: magic vars 19110 1726882585.88894: variable 'omit' from source: magic vars 19110 1726882585.89003: variable 'interface' from source: set_fact 19110 1726882585.89024: variable 'omit' from source: magic vars 19110 1726882585.89075: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19110 1726882585.89112: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19110 1726882585.89135: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19110 1726882585.89159: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19110 1726882585.89182: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, 
class_only=False) 19110 1726882585.89216: variable 'inventory_hostname' from source: host vars for 'managed_node1' 19110 1726882585.89224: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882585.89231: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882585.89339: Set connection var ansible_timeout to 10 19110 1726882585.89358: Set connection var ansible_module_compression to ZIP_DEFLATED 19110 1726882585.89371: Set connection var ansible_shell_executable to /bin/sh 19110 1726882585.89378: Set connection var ansible_shell_type to sh 19110 1726882585.89384: Set connection var ansible_connection to ssh 19110 1726882585.89395: Set connection var ansible_pipelining to False 19110 1726882585.89422: variable 'ansible_shell_executable' from source: unknown 19110 1726882585.89429: variable 'ansible_connection' from source: unknown 19110 1726882585.89436: variable 'ansible_module_compression' from source: unknown 19110 1726882585.89442: variable 'ansible_shell_type' from source: unknown 19110 1726882585.89449: variable 'ansible_shell_executable' from source: unknown 19110 1726882585.89458: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882585.89469: variable 'ansible_pipelining' from source: unknown 19110 1726882585.89476: variable 'ansible_timeout' from source: unknown 19110 1726882585.89485: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882585.89640: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 19110 1726882585.89659: variable 'omit' from source: magic vars 19110 1726882585.89673: starting attempt loop 19110 1726882585.89681: running the handler 19110 1726882585.89827: variable 
'interface_stat' from source: set_fact 19110 1726882585.89846: Evaluated conditional (not interface_stat.stat.exists): True 19110 1726882585.89857: handler run complete 19110 1726882585.89878: attempt loop complete, returning result 19110 1726882585.89885: _execute() done 19110 1726882585.89892: dumping result to json 19110 1726882585.89899: done dumping result, returning 19110 1726882585.89909: done running TaskExecutor() for managed_node1/TASK: Assert that the interface is absent - 'lsr27' [0e448fcc-3ce9-5372-c19a-00000000053d] 19110 1726882585.89920: sending task result for task 0e448fcc-3ce9-5372-c19a-00000000053d 19110 1726882585.90022: done sending task result for task 0e448fcc-3ce9-5372-c19a-00000000053d 19110 1726882585.90030: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false } MSG: All assertions passed 19110 1726882585.90100: no more pending results, returning what we have 19110 1726882585.90105: results queue empty 19110 1726882585.90106: checking for any_errors_fatal 19110 1726882585.90115: done checking for any_errors_fatal 19110 1726882585.90116: checking for max_fail_percentage 19110 1726882585.90118: done checking for max_fail_percentage 19110 1726882585.90119: checking to see if all hosts have failed and the running result is not ok 19110 1726882585.90120: done checking to see if all hosts have failed 19110 1726882585.90121: getting the remaining hosts for this loop 19110 1726882585.90122: done getting the remaining hosts for this loop 19110 1726882585.90126: getting the next task for host managed_node1 19110 1726882585.90135: done getting next task for host managed_node1 19110 1726882585.90137: ^ task is: TASK: meta (flush_handlers) 19110 1726882585.90139: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 19110 1726882585.90144: getting variables 19110 1726882585.90146: in VariableManager get_vars() 19110 1726882585.90181: Calling all_inventory to load vars for managed_node1 19110 1726882585.90184: Calling groups_inventory to load vars for managed_node1 19110 1726882585.90187: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882585.90199: Calling all_plugins_play to load vars for managed_node1 19110 1726882585.90203: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882585.90206: Calling groups_plugins_play to load vars for managed_node1 19110 1726882585.91304: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882585.92369: done with get_vars() 19110 1726882585.92384: done getting variables 19110 1726882585.92442: in VariableManager get_vars() 19110 1726882585.92469: Calling all_inventory to load vars for managed_node1 19110 1726882585.92472: Calling groups_inventory to load vars for managed_node1 19110 1726882585.92474: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882585.92479: Calling all_plugins_play to load vars for managed_node1 19110 1726882585.92481: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882585.92484: Calling groups_plugins_play to load vars for managed_node1 19110 1726882585.93808: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882585.95660: done with get_vars() 19110 1726882585.95693: done queuing things up, now waiting for results queue to drain 19110 1726882585.95695: results queue empty 19110 1726882585.95696: checking for any_errors_fatal 19110 1726882585.95699: done checking for any_errors_fatal 19110 1726882585.95699: checking for max_fail_percentage 19110 1726882585.95700: done checking for max_fail_percentage 19110 1726882585.95701: checking to see if all hosts have failed and 
the running result is not ok 19110 1726882585.95702: done checking to see if all hosts have failed 19110 1726882585.95708: getting the remaining hosts for this loop 19110 1726882585.95709: done getting the remaining hosts for this loop 19110 1726882585.95711: getting the next task for host managed_node1 19110 1726882585.95714: done getting next task for host managed_node1 19110 1726882585.95716: ^ task is: TASK: meta (flush_handlers) 19110 1726882585.95717: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19110 1726882585.95720: getting variables 19110 1726882585.95721: in VariableManager get_vars() 19110 1726882585.95727: Calling all_inventory to load vars for managed_node1 19110 1726882585.95730: Calling groups_inventory to load vars for managed_node1 19110 1726882585.95732: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882585.95736: Calling all_plugins_play to load vars for managed_node1 19110 1726882585.95739: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882585.95741: Calling groups_plugins_play to load vars for managed_node1 19110 1726882585.97132: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882585.98979: done with get_vars() 19110 1726882585.98999: done getting variables 19110 1726882585.99057: in VariableManager get_vars() 19110 1726882585.99067: Calling all_inventory to load vars for managed_node1 19110 1726882585.99070: Calling groups_inventory to load vars for managed_node1 19110 1726882585.99072: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882585.99076: Calling all_plugins_play to load vars for managed_node1 19110 1726882585.99079: Calling 
groups_plugins_inventory to load vars for managed_node1 19110 1726882585.99081: Calling groups_plugins_play to load vars for managed_node1 19110 1726882586.00310: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882586.02297: done with get_vars() 19110 1726882586.02321: done queuing things up, now waiting for results queue to drain 19110 1726882586.02323: results queue empty 19110 1726882586.02324: checking for any_errors_fatal 19110 1726882586.02325: done checking for any_errors_fatal 19110 1726882586.02326: checking for max_fail_percentage 19110 1726882586.02327: done checking for max_fail_percentage 19110 1726882586.02327: checking to see if all hosts have failed and the running result is not ok 19110 1726882586.02328: done checking to see if all hosts have failed 19110 1726882586.02329: getting the remaining hosts for this loop 19110 1726882586.02330: done getting the remaining hosts for this loop 19110 1726882586.02332: getting the next task for host managed_node1 19110 1726882586.02335: done getting next task for host managed_node1 19110 1726882586.02336: ^ task is: None 19110 1726882586.02338: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19110 1726882586.02339: done queuing things up, now waiting for results queue to drain 19110 1726882586.02340: results queue empty 19110 1726882586.02340: checking for any_errors_fatal 19110 1726882586.02341: done checking for any_errors_fatal 19110 1726882586.02342: checking for max_fail_percentage 19110 1726882586.02343: done checking for max_fail_percentage 19110 1726882586.02343: checking to see if all hosts have failed and the running result is not ok 19110 1726882586.02344: done checking to see if all hosts have failed 19110 1726882586.02345: getting the next task for host managed_node1 19110 1726882586.02347: done getting next task for host managed_node1 19110 1726882586.02348: ^ task is: None 19110 1726882586.02350: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19110 1726882586.02391: in VariableManager get_vars() 19110 1726882586.02405: done with get_vars() 19110 1726882586.02410: in VariableManager get_vars() 19110 1726882586.02418: done with get_vars() 19110 1726882586.02422: variable 'omit' from source: magic vars 19110 1726882586.02452: in VariableManager get_vars() 19110 1726882586.02463: done with get_vars() 19110 1726882586.02485: variable 'omit' from source: magic vars PLAY [Verify that cleanup restored state to default] *************************** 19110 1726882586.02662: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 19110 1726882586.02688: getting the remaining hosts for this loop 19110 1726882586.02689: done getting the remaining hosts for this loop 19110 1726882586.02692: getting the next task for host managed_node1 19110 1726882586.02694: done getting next task for host managed_node1 19110 1726882586.02696: ^ task is: TASK: Gathering Facts 19110 1726882586.02697: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19110 1726882586.02699: getting variables 19110 1726882586.02700: in VariableManager get_vars() 19110 1726882586.02708: Calling all_inventory to load vars for managed_node1 19110 1726882586.02710: Calling groups_inventory to load vars for managed_node1 19110 1726882586.02712: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882586.02717: Calling all_plugins_play to load vars for managed_node1 19110 1726882586.02719: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882586.02722: Calling groups_plugins_play to load vars for managed_node1 19110 1726882586.04931: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882586.06808: done with get_vars() 19110 1726882586.06828: done getting variables 19110 1726882586.06874: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:77 Friday 20 September 2024 21:36:26 -0400 (0:00:00.193) 0:00:42.926 ****** 19110 1726882586.06899: entering _queue_task() for managed_node1/gather_facts 19110 1726882586.07202: worker is 1 (out of 1 available) 19110 1726882586.07213: exiting _queue_task() for managed_node1/gather_facts 19110 1726882586.07223: done queuing things up, now waiting for results queue to drain 19110 1726882586.07225: waiting for pending results... 
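The assert action above logged `Evaluated conditional (not interface_stat.stat.exists): True`, i.e. it passed because the registered stat result reported the device file as missing. That conditional can be mirrored in plain Python (a hedged sketch: `interface_is_absent` is a stand-in helper, and the dict is shaped like the stat module result printed in the log, not Ansible's internal result object):

```python
def interface_is_absent(interface_stat: dict) -> bool:
    """True when a registered stat result says the path does not exist.

    Equivalent to the Jinja2 condition the assert task evaluates:
        not interface_stat.stat.exists
    """
    return not interface_stat["stat"]["exists"]

# Shaped like the result returned by the stat task earlier in the log.
result = {"changed": False, "stat": {"exists": False}}
print(interface_is_absent(result))  # → True, so "All assertions passed"
```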
19110 1726882586.07491: running TaskExecutor() for managed_node1/TASK: Gathering Facts 19110 1726882586.07594: in run() - task 0e448fcc-3ce9-5372-c19a-00000000056d 19110 1726882586.07613: variable 'ansible_search_path' from source: unknown 19110 1726882586.07653: calling self._execute() 19110 1726882586.07745: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882586.07760: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882586.07777: variable 'omit' from source: magic vars 19110 1726882586.08151: variable 'ansible_distribution_major_version' from source: facts 19110 1726882586.08172: Evaluated conditional (ansible_distribution_major_version != '6'): True 19110 1726882586.08182: variable 'omit' from source: magic vars 19110 1726882586.08212: variable 'omit' from source: magic vars 19110 1726882586.08247: variable 'omit' from source: magic vars 19110 1726882586.08294: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19110 1726882586.08372: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19110 1726882586.08398: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19110 1726882586.08423: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19110 1726882586.08439: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19110 1726882586.08500: variable 'inventory_hostname' from source: host vars for 'managed_node1' 19110 1726882586.08508: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882586.08515: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882586.08620: Set connection var ansible_timeout to 10 19110 1726882586.08641: Set connection 
var ansible_module_compression to ZIP_DEFLATED 19110 1726882586.08661: Set connection var ansible_shell_executable to /bin/sh 19110 1726882586.08672: Set connection var ansible_shell_type to sh 19110 1726882586.08678: Set connection var ansible_connection to ssh 19110 1726882586.08687: Set connection var ansible_pipelining to False 19110 1726882586.08711: variable 'ansible_shell_executable' from source: unknown 19110 1726882586.08718: variable 'ansible_connection' from source: unknown 19110 1726882586.08724: variable 'ansible_module_compression' from source: unknown 19110 1726882586.08730: variable 'ansible_shell_type' from source: unknown 19110 1726882586.08735: variable 'ansible_shell_executable' from source: unknown 19110 1726882586.08742: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882586.08751: variable 'ansible_pipelining' from source: unknown 19110 1726882586.08760: variable 'ansible_timeout' from source: unknown 19110 1726882586.08771: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882586.08950: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 19110 1726882586.08975: variable 'omit' from source: magic vars 19110 1726882586.08985: starting attempt loop 19110 1726882586.08991: running the handler 19110 1726882586.09010: variable 'ansible_facts' from source: unknown 19110 1726882586.09033: _low_level_execute_command(): starting 19110 1726882586.09045: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 19110 1726882586.09832: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882586.09849: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 
1726882586.09871: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882586.09890: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882586.09933: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882586.09950: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882586.09970: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882586.09990: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882586.10003: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882586.10013: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882586.10026: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882586.10040: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882586.10065: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882586.10079: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882586.10090: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882586.10103: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882586.10187: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882586.10204: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882586.10219: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882586.10400: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 
1726882586.12071: stdout chunk (state=3): >>>/root <<< 19110 1726882586.12251: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882586.12254: stdout chunk (state=3): >>><<< 19110 1726882586.12259: stderr chunk (state=3): >>><<< 19110 1726882586.12369: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 1726882586.12373: _low_level_execute_command(): starting 19110 1726882586.12376: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882586.1228113-21022-84337449341453 `" && echo ansible-tmp-1726882586.1228113-21022-84337449341453="` echo /root/.ansible/tmp/ansible-tmp-1726882586.1228113-21022-84337449341453 `" ) && sleep 0' 19110 1726882586.14695: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 
3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882586.14698: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882586.14734: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882586.14738: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882586.14742: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882586.15016: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882586.15019: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882586.15023: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882586.15131: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882586.16986: stdout chunk (state=3): >>>ansible-tmp-1726882586.1228113-21022-84337449341453=/root/.ansible/tmp/ansible-tmp-1726882586.1228113-21022-84337449341453 <<< 19110 1726882586.17178: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882586.17181: stdout chunk (state=3): >>><<< 19110 1726882586.17184: stderr chunk (state=3): >>><<< 19110 1726882586.17370: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726882586.1228113-21022-84337449341453=/root/.ansible/tmp/ansible-tmp-1726882586.1228113-21022-84337449341453 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 1726882586.17374: variable 'ansible_module_compression' from source: unknown 19110 1726882586.17377: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-191108pnkimox/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 19110 1726882586.17379: variable 'ansible_facts' from source: unknown 19110 1726882586.17518: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882586.1228113-21022-84337449341453/AnsiballZ_setup.py 19110 1726882586.18160: Sending initial data 19110 1726882586.18165: Sent initial data (153 bytes) 19110 1726882586.20579: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882586.20592: stderr chunk (state=3): >>>debug1: Reading configuration 
data /root/.ssh/config <<< 19110 1726882586.20683: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882586.20708: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882586.20747: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882586.20759: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882586.20778: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882586.20805: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882586.20818: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882586.20833: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882586.20848: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882586.20866: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882586.20886: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882586.20900: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882586.20914: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882586.20930: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882586.21009: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882586.21030: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882586.21044: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882586.21167: stderr chunk (state=3): >>>debug1: mux_client_request_session: master 
session id: 2 <<< 19110 1726882586.22902: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 19110 1726882586.22995: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 19110 1726882586.23092: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-191108pnkimox/tmpyezu6bez /root/.ansible/tmp/ansible-tmp-1726882586.1228113-21022-84337449341453/AnsiballZ_setup.py <<< 19110 1726882586.23185: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 19110 1726882586.25769: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882586.25971: stderr chunk (state=3): >>><<< 19110 1726882586.25975: stdout chunk (state=3): >>><<< 19110 1726882586.25978: done transferring module to remote 19110 1726882586.25979: _low_level_execute_command(): starting 19110 1726882586.25981: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882586.1228113-21022-84337449341453/ /root/.ansible/tmp/ansible-tmp-1726882586.1228113-21022-84337449341453/AnsiballZ_setup.py && sleep 0' 19110 1726882586.26558: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882586.26577: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882586.26593: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882586.26615: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882586.26668: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882586.26682: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882586.26697: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882586.26716: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882586.26730: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882586.26746: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882586.26769: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882586.26787: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882586.26806: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882586.26821: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882586.26835: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882586.26851: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882586.26935: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882586.26952: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882586.26972: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882586.27095: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882586.28900: stderr chunk (state=3): >>>debug2: 
Received exit status from master 0 <<< 19110 1726882586.28903: stdout chunk (state=3): >>><<< 19110 1726882586.28906: stderr chunk (state=3): >>><<< 19110 1726882586.28998: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 1726882586.29002: _low_level_execute_command(): starting 19110 1726882586.29005: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882586.1228113-21022-84337449341453/AnsiballZ_setup.py && sleep 0' 19110 1726882586.29551: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882586.29572: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882586.29587: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882586.29606: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882586.29646: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882586.29663: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882586.29683: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882586.29701: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882586.29714: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882586.29725: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882586.29736: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882586.29748: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882586.29766: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882586.29778: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882586.29786: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882586.29797: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882586.29869: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882586.29885: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882586.29899: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882586.30033: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882586.81009: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:ef4e1c39-6f50-438a-87e7-12fb70b80bde", "ansible_loadavg": 
{"1m": 0.46, "5m": 0.41, "15m": 0.22}, "ansible_fips": false, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system": "Linux", "ansible_kernel": "5.14.0-508.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 15:49:37 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "ip-10-31-44-90.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-44-90", "ansible_nodename": "ip-10-31-44-90.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "465293f2bd9b457497a5eaf565f184f8", "ansible_apparmor": {"status": "disabled"}, "ansible_lsb": {}, "ansible_is_chroot": false, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBANd7RrvRqB/kTBmp4g3vOXSd1XQX7zSKmnuTmpsZ60vmB0PwXJIj7HTI9DBqGrzofXOOHlh/Qs4ml+D2H79EO+6EUEZza9meZ+anJVVwXBuxgfn1Hl/EExsmp7gV5o+r0rCFeuFds6ISUZQpal2OlQk3lrit00RA//JoxvQO1YQdAAAAFQDxLJGSeERY5sZYyGr88om1BNq9KQAAAIEA1WO5<<< 19110 1726882586.81018: stdout chunk (state=3): 
>>>ElHMof6Upf9GQZn4wlJh6JUOjZfYnL4XATh/W6uye7kuC/rBqGPirkmks1GCUkKhSkzQlRIPyLYENrbPKRMNarYAwwQ8N8BDOUWFDCcrO55SJdlLTyYGWdlVFysYGSMIyZT5ye4oL3Nkff/e1ZGqjvL0sLNJaye4za70Xj4AAACAXRnpJPDKgCzakgHO14HsH3r7qS4cvfQld+ThBJZcbtKtJUyUo1sqg9NbvK+hvA41TYWOVy52DcWn5yzPwfhAn+mQEcAdBXWqSSJdESZ9fPnbc9C1XO02sYpyX5+wsZnau23XhtnlnY8jRTpWgRt4X8AWaSND9mfeL8ZDTgVphRc=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCi3knQjBMOMykE1WC1CWkOsV5q0hmL0iOCSTq050rqbxjUmkXoe4BeOWZh4ID8k5GgD5emRh5PU5ME25rsP6hnelUBzDOCjdgI1rmTkUbw5oFRN+kpX2RbAM/2L5J6RrqqllFBjrHtSG6JINsPIWqLn9rlmE965Kj4AY3QplgiApeN07yDM5YPS2tpRpVHVIxZia7pdzKs+h+TXJoo/Z7SMAw8MTUNeIXd9bSzuxhulOrLNrvrcej4EVg88FYiy96oA+NBNLFb41RHNLumM3xUQvjCeyqP1hcUw9YAY+gwADhRGBKQ+JCAzzLqyM/3RAO8gXXJr1Yjr+H9xi8IwKB71QU/lw7bWc33YuNbe5mDlUHQ/a2qvo4O96wD8m4eZpu81iHiwtIU5cwKm+fk8sz9kxOR77AozaYLtjgW9FYUmxh2ChfTBV2rnFflhC3CjgRMlZv8CLtne5JcsRFSpHeCB2RXzA1JPiF89OxoUFa8NsPqdUyAMUkaR8MmW+fj+t8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAqf0PBdBRTqmYqGYFABEC2soCa/Gsi3A2munUEZEo0enWjwRQivB5pKJDjOn6lwgGyr2ebU0/VpRzddk73uEfk=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOdij61fBkKDElS700Z560nYW2c4QCIx/VplUW7jn+UE", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_iscsi_iqn": "", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": 
"ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_local": {}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "36", "second": "26", "epoch": "1726882586", "epoch_int": "1726882586", "date": "2024-09-20", "time": "21:36:26", "iso8601_micro": "2024-09-21T01:36:26.551373Z", "iso8601": "2024-09-21T01:36:26Z", "iso8601_basic": "20240920T213626551373", "iso8601_basic_short": "20240920T213626", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2799, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 733, "free": 2799}, "nocache": {"free": 3261, "used": 271}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2ca037-5a7b-1ebc-28c4-7fb32adce23a", "ansible_product_uuid": "ec2ca037-5a7b-1ebc-28c4-7fb32adce23a", 
"ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 744, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264239202304, "block_size": 4096, "block_total": 65519355, "block_available": 64511524, "block_used": 1007831, "inode_total": 131071472, "inode_available": 130998699, "inode_used": 72773, "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013"}], "ansible_env": {"SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.40.7 34614 10.31.44.90 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.40.7 34614 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": 
"/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_fibre_channel_wwn": [], "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "02:9e:a1:0b:f9:6d", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.44.90", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::9e:a1ff:fe0b:f96d", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", 
"tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", 
"tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": 
"eth0", "address": "10.31.44.90", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:9e:a1:0b:f9:6d", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.44.90"], "ansible_all_ipv6_addresses": ["fe80::9e:a1ff:fe0b:f96d"<<< 19110 1726882586.81028: stdout chunk (state=3): >>>], "ansible_locally_reachable_ips": {"ipv4": ["10.31.44.90", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::9e:a1ff:fe0b:f96d"]}, "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 19110 1726882586.82650: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. <<< 19110 1726882586.82710: stderr chunk (state=3): >>><<< 19110 1726882586.82714: stdout chunk (state=3): >>><<< 19110 1726882586.82739: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:ef4e1c39-6f50-438a-87e7-12fb70b80bde", "ansible_loadavg": {"1m": 0.46, "5m": 0.41, "15m": 0.22}, "ansible_fips": false, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system": "Linux", "ansible_kernel": "5.14.0-508.el9.x86_64", 
"ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 15:49:37 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "ip-10-31-44-90.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-44-90", "ansible_nodename": "ip-10-31-44-90.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "465293f2bd9b457497a5eaf565f184f8", "ansible_apparmor": {"status": "disabled"}, "ansible_lsb": {}, "ansible_is_chroot": false, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBANd7RrvRqB/kTBmp4g3vOXSd1XQX7zSKmnuTmpsZ60vmB0PwXJIj7HTI9DBqGrzofXOOHlh/Qs4ml+D2H79EO+6EUEZza9meZ+anJVVwXBuxgfn1Hl/EExsmp7gV5o+r0rCFeuFds6ISUZQpal2OlQk3lrit00RA//JoxvQO1YQdAAAAFQDxLJGSeERY5sZYyGr88om1BNq9KQAAAIEA1WO5ElHMof6Upf9GQZn4wlJh6JUOjZfYnL4XATh/W6uye7kuC/rBqGPirkmks1GCUkKhSkzQlRIPyLYENrbPKRMNarYAwwQ8N8BDOUWFDCcrO55SJdlLTyYGWdlVFysYGSMIyZT5ye4oL3Nkff/e1ZGqjvL0sLNJaye4za70Xj4AAACAXRnpJPDKgCzakgHO14HsH3r7qS4cvfQld+ThBJZcbtKtJUyUo1sqg9NbvK+hvA41TYWOVy52DcWn5yzPwfhAn+mQEcAdBXWqSSJdESZ9fPnbc9C1XO02sYpyX5+wsZnau23XhtnlnY8jRTpWgRt4X8AWaSND9mfeL8ZDTgVphRc=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQCi3knQjBMOMykE1WC1CWkOsV5q0hmL0iOCSTq050rqbxjUmkXoe4BeOWZh4ID8k5GgD5emRh5PU5ME25rsP6hnelUBzDOCjdgI1rmTkUbw5oFRN+kpX2RbAM/2L5J6RrqqllFBjrHtSG6JINsPIWqLn9rlmE965Kj4AY3QplgiApeN07yDM5YPS2tpRpVHVIxZia7pdzKs+h+TXJoo/Z7SMAw8MTUNeIXd9bSzuxhulOrLNrvrcej4EVg88FYiy96oA+NBNLFb41RHNLumM3xUQvjCeyqP1hcUw9YAY+gwADhRGBKQ+JCAzzLqyM/3RAO8gXXJr1Yjr+H9xi8IwKB71QU/lw7bWc33YuNbe5mDlUHQ/a2qvo4O96wD8m4eZpu81iHiwtIU5cwKm+fk8sz9kxOR77AozaYLtjgW9FYUmxh2ChfTBV2rnFflhC3CjgRMlZv8CLtne5JcsRFSpHeCB2RXzA1JPiF89OxoUFa8NsPqdUyAMUkaR8MmW+fj+t8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAqf0PBdBRTqmYqGYFABEC2soCa/Gsi3A2munUEZEo0enWjwRQivB5pKJDjOn6lwgGyr2ebU0/VpRzddk73uEfk=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOdij61fBkKDElS700Z560nYW2c4QCIx/VplUW7jn+UE", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_iscsi_iqn": "", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_local": {}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", 
"hour": "21", "minute": "36", "second": "26", "epoch": "1726882586", "epoch_int": "1726882586", "date": "2024-09-20", "time": "21:36:26", "iso8601_micro": "2024-09-21T01:36:26.551373Z", "iso8601": "2024-09-21T01:36:26Z", "iso8601_basic": "20240920T213626551373", "iso8601_basic_short": "20240920T213626", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2799, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 733, "free": 2799}, "nocache": {"free": 3261, "used": 271}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2ca037-5a7b-1ebc-28c4-7fb32adce23a", "ansible_product_uuid": "ec2ca037-5a7b-1ebc-28c4-7fb32adce23a", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"], "labels": [], "masters": []}, "start": "2048", 
"sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 744, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264239202304, "block_size": 4096, "block_total": 65519355, "block_available": 64511524, "block_used": 1007831, "inode_total": 131071472, "inode_available": 130998699, "inode_used": 72773, "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013"}], "ansible_env": {"SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.40.7 34614 10.31.44.90 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.40.7 34614 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_fibre_channel_wwn": [], "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", 
"serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "02:9e:a1:0b:f9:6d", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.44.90", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::9e:a1ff:fe0b:f96d", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off 
[fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", 
"vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.44.90", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:9e:a1:0b:f9:6d", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.44.90"], "ansible_all_ipv6_addresses": ["fe80::9e:a1ff:fe0b:f96d"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.44.90", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", 
"fe80::9e:a1ff:fe0b:f96d"]}, "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
19110 1726882586.82957: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882586.1228113-21022-84337449341453/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 19110 1726882586.82974: _low_level_execute_command(): starting 19110 1726882586.82977: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882586.1228113-21022-84337449341453/ > /dev/null 2>&1 && sleep 0' 19110 1726882586.83422: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882586.83435: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882586.83454: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882586.83476: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882586.83522: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882586.83538: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882586.83632: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882586.85432: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882586.85499: stderr chunk (state=3): >>><<< 19110 1726882586.85507: stdout chunk (state=3): >>><<< 19110 1726882586.85528: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 1726882586.85547: handler run complete 19110 1726882586.85684: variable 
'ansible_facts' from source: unknown 19110 1726882586.85796: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882586.86214: variable 'ansible_facts' from source: unknown 19110 1726882586.86368: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882586.86460: attempt loop complete, returning result 19110 1726882586.86475: _execute() done 19110 1726882586.86478: dumping result to json 19110 1726882586.86497: done dumping result, returning 19110 1726882586.86504: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [0e448fcc-3ce9-5372-c19a-00000000056d] 19110 1726882586.86509: sending task result for task 0e448fcc-3ce9-5372-c19a-00000000056d 19110 1726882586.86914: done sending task result for task 0e448fcc-3ce9-5372-c19a-00000000056d 19110 1726882586.86917: WORKER PROCESS EXITING ok: [managed_node1] 19110 1726882586.87121: no more pending results, returning what we have 19110 1726882586.87124: results queue empty 19110 1726882586.87124: checking for any_errors_fatal 19110 1726882586.87125: done checking for any_errors_fatal 19110 1726882586.87126: checking for max_fail_percentage 19110 1726882586.87127: done checking for max_fail_percentage 19110 1726882586.87127: checking to see if all hosts have failed and the running result is not ok 19110 1726882586.87128: done checking to see if all hosts have failed 19110 1726882586.87128: getting the remaining hosts for this loop 19110 1726882586.87129: done getting the remaining hosts for this loop 19110 1726882586.87131: getting the next task for host managed_node1 19110 1726882586.87135: done getting next task for host managed_node1 19110 1726882586.87136: ^ task is: TASK: meta (flush_handlers) 19110 1726882586.87137: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, 
pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19110 1726882586.87140: getting variables 19110 1726882586.87141: in VariableManager get_vars() 19110 1726882586.87160: Calling all_inventory to load vars for managed_node1 19110 1726882586.87162: Calling groups_inventory to load vars for managed_node1 19110 1726882586.87166: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882586.87174: Calling all_plugins_play to load vars for managed_node1 19110 1726882586.87176: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882586.87178: Calling groups_plugins_play to load vars for managed_node1 19110 1726882586.88026: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882586.89598: done with get_vars() 19110 1726882586.89621: done getting variables 19110 1726882586.89694: in VariableManager get_vars() 19110 1726882586.89703: Calling all_inventory to load vars for managed_node1 19110 1726882586.89705: Calling groups_inventory to load vars for managed_node1 19110 1726882586.89708: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882586.89712: Calling all_plugins_play to load vars for managed_node1 19110 1726882586.89714: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882586.89717: Calling groups_plugins_play to load vars for managed_node1 19110 1726882586.91052: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882586.92822: done with get_vars() 19110 1726882586.92846: done queuing things up, now waiting for results queue to drain 19110 1726882586.92848: results queue empty 19110 1726882586.92849: checking for any_errors_fatal 19110 1726882586.92852: done checking for any_errors_fatal 19110 1726882586.92853: checking for max_fail_percentage 19110 
1726882586.92854: done checking for max_fail_percentage 19110 1726882586.92857: checking to see if all hosts have failed and the running result is not ok 19110 1726882586.92862: done checking to see if all hosts have failed 19110 1726882586.92864: getting the remaining hosts for this loop 19110 1726882586.92865: done getting the remaining hosts for this loop 19110 1726882586.92868: getting the next task for host managed_node1 19110 1726882586.92872: done getting next task for host managed_node1 19110 1726882586.92874: ^ task is: TASK: Verify network state restored to default 19110 1726882586.92876: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19110 1726882586.92878: getting variables 19110 1726882586.92879: in VariableManager get_vars() 19110 1726882586.92886: Calling all_inventory to load vars for managed_node1 19110 1726882586.92888: Calling groups_inventory to load vars for managed_node1 19110 1726882586.92891: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882586.92895: Calling all_plugins_play to load vars for managed_node1 19110 1726882586.92898: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882586.92900: Calling groups_plugins_play to load vars for managed_node1 19110 1726882586.94135: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882586.95932: done with get_vars() 19110 1726882586.95954: done getting variables TASK [Verify network state restored to default] ******************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:80 Friday 20 September 2024 21:36:26 -0400 (0:00:00.891) 0:00:43.817 ****** 19110 
1726882586.96031: entering _queue_task() for managed_node1/include_tasks 19110 1726882586.96364: worker is 1 (out of 1 available) 19110 1726882586.96379: exiting _queue_task() for managed_node1/include_tasks 19110 1726882586.96390: done queuing things up, now waiting for results queue to drain 19110 1726882586.96391: waiting for pending results... 19110 1726882586.96676: running TaskExecutor() for managed_node1/TASK: Verify network state restored to default 19110 1726882586.96790: in run() - task 0e448fcc-3ce9-5372-c19a-000000000078 19110 1726882586.96813: variable 'ansible_search_path' from source: unknown 19110 1726882586.96861: calling self._execute() 19110 1726882586.96968: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882586.96983: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882586.96999: variable 'omit' from source: magic vars 19110 1726882586.97364: variable 'ansible_distribution_major_version' from source: facts 19110 1726882586.97386: Evaluated conditional (ansible_distribution_major_version != '6'): True 19110 1726882586.97396: _execute() done 19110 1726882586.97404: dumping result to json 19110 1726882586.97411: done dumping result, returning 19110 1726882586.97419: done running TaskExecutor() for managed_node1/TASK: Verify network state restored to default [0e448fcc-3ce9-5372-c19a-000000000078] 19110 1726882586.97429: sending task result for task 0e448fcc-3ce9-5372-c19a-000000000078 19110 1726882586.97551: no more pending results, returning what we have 19110 1726882586.97560: in VariableManager get_vars() 19110 1726882586.97601: Calling all_inventory to load vars for managed_node1 19110 1726882586.97605: Calling groups_inventory to load vars for managed_node1 19110 1726882586.97609: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882586.97626: Calling all_plugins_play to load vars for managed_node1 19110 1726882586.97632: Calling groups_plugins_inventory to 
load vars for managed_node1 19110 1726882586.97636: Calling groups_plugins_play to load vars for managed_node1 19110 1726882586.98683: done sending task result for task 0e448fcc-3ce9-5372-c19a-000000000078 19110 1726882586.98686: WORKER PROCESS EXITING 19110 1726882587.07329: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882587.09041: done with get_vars() 19110 1726882587.09067: variable 'ansible_search_path' from source: unknown 19110 1726882587.09081: we have included files to process 19110 1726882587.09082: generating all_blocks data 19110 1726882587.09084: done generating all_blocks data 19110 1726882587.09084: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 19110 1726882587.09085: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 19110 1726882587.09088: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 19110 1726882587.09462: done processing included file 19110 1726882587.09466: iterating over new_blocks loaded from include file 19110 1726882587.09468: in VariableManager get_vars() 19110 1726882587.09479: done with get_vars() 19110 1726882587.09480: filtering new block on tags 19110 1726882587.09497: done filtering new block on tags 19110 1726882587.09499: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml for managed_node1 19110 1726882587.09503: extending task lists for all hosts with included blocks 19110 1726882587.09532: done extending task lists 19110 1726882587.09533: done processing included files 19110 1726882587.09534: results queue empty 19110 1726882587.09535: checking for 
any_errors_fatal 19110 1726882587.09536: done checking for any_errors_fatal 19110 1726882587.09537: checking for max_fail_percentage 19110 1726882587.09538: done checking for max_fail_percentage 19110 1726882587.09539: checking to see if all hosts have failed and the running result is not ok 19110 1726882587.09539: done checking to see if all hosts have failed 19110 1726882587.09540: getting the remaining hosts for this loop 19110 1726882587.09541: done getting the remaining hosts for this loop 19110 1726882587.09543: getting the next task for host managed_node1 19110 1726882587.09547: done getting next task for host managed_node1 19110 1726882587.09548: ^ task is: TASK: Check routes and DNS 19110 1726882587.09550: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19110 1726882587.09553: getting variables 19110 1726882587.09553: in VariableManager get_vars() 19110 1726882587.09566: Calling all_inventory to load vars for managed_node1 19110 1726882587.09569: Calling groups_inventory to load vars for managed_node1 19110 1726882587.09571: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882587.09576: Calling all_plugins_play to load vars for managed_node1 19110 1726882587.09579: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882587.09582: Calling groups_plugins_play to load vars for managed_node1 19110 1726882587.10860: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882587.12743: done with get_vars() 19110 1726882587.12772: done getting variables 19110 1726882587.12813: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Check routes and DNS] **************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:6 Friday 20 September 2024 21:36:27 -0400 (0:00:00.168) 0:00:43.985 ****** 19110 1726882587.12838: entering _queue_task() for managed_node1/shell 19110 1726882587.13197: worker is 1 (out of 1 available) 19110 1726882587.13210: exiting _queue_task() for managed_node1/shell 19110 1726882587.13222: done queuing things up, now waiting for results queue to drain 19110 1726882587.13224: waiting for pending results... 
19110 1726882587.13522: running TaskExecutor() for managed_node1/TASK: Check routes and DNS 19110 1726882587.13651: in run() - task 0e448fcc-3ce9-5372-c19a-00000000057e 19110 1726882587.13682: variable 'ansible_search_path' from source: unknown 19110 1726882587.13689: variable 'ansible_search_path' from source: unknown 19110 1726882587.13729: calling self._execute() 19110 1726882587.13822: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882587.13833: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882587.13846: variable 'omit' from source: magic vars 19110 1726882587.14229: variable 'ansible_distribution_major_version' from source: facts 19110 1726882587.14246: Evaluated conditional (ansible_distribution_major_version != '6'): True 19110 1726882587.14259: variable 'omit' from source: magic vars 19110 1726882587.14298: variable 'omit' from source: magic vars 19110 1726882587.14342: variable 'omit' from source: magic vars 19110 1726882587.14390: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19110 1726882587.14432: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19110 1726882587.14461: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19110 1726882587.14487: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19110 1726882587.14503: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19110 1726882587.14540: variable 'inventory_hostname' from source: host vars for 'managed_node1' 19110 1726882587.14548: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882587.14558: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882587.14661: 
Set connection var ansible_timeout to 10 19110 1726882587.14683: Set connection var ansible_module_compression to ZIP_DEFLATED 19110 1726882587.14694: Set connection var ansible_shell_executable to /bin/sh 19110 1726882587.14700: Set connection var ansible_shell_type to sh 19110 1726882587.14706: Set connection var ansible_connection to ssh 19110 1726882587.14714: Set connection var ansible_pipelining to False 19110 1726882587.14739: variable 'ansible_shell_executable' from source: unknown 19110 1726882587.14748: variable 'ansible_connection' from source: unknown 19110 1726882587.14757: variable 'ansible_module_compression' from source: unknown 19110 1726882587.14766: variable 'ansible_shell_type' from source: unknown 19110 1726882587.14773: variable 'ansible_shell_executable' from source: unknown 19110 1726882587.14778: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882587.14784: variable 'ansible_pipelining' from source: unknown 19110 1726882587.14790: variable 'ansible_timeout' from source: unknown 19110 1726882587.14796: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882587.14936: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 19110 1726882587.14952: variable 'omit' from source: magic vars 19110 1726882587.14969: starting attempt loop 19110 1726882587.14977: running the handler 19110 1726882587.14991: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 19110 1726882587.15014: 
_low_level_execute_command(): starting 19110 1726882587.15026: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 19110 1726882587.15812: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882587.15830: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882587.15847: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882587.15873: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882587.15915: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882587.15927: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882587.15946: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882587.15969: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882587.15981: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882587.15991: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882587.16002: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882587.16013: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882587.16027: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882587.16038: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882587.16050: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882587.16071: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882587.16138: stderr chunk (state=3): >>>debug1: auto-mux: Trying 
existing master <<< 19110 1726882587.16159: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882587.16179: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882587.16399: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882587.18058: stdout chunk (state=3): >>>/root <<< 19110 1726882587.18254: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882587.18261: stdout chunk (state=3): >>><<< 19110 1726882587.18268: stderr chunk (state=3): >>><<< 19110 1726882587.18398: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 1726882587.18402: _low_level_execute_command(): starting 19110 1726882587.18413: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo 
/root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882587.1829252-21057-182443960117921 `" && echo ansible-tmp-1726882587.1829252-21057-182443960117921="` echo /root/.ansible/tmp/ansible-tmp-1726882587.1829252-21057-182443960117921 `" ) && sleep 0' 19110 1726882587.20289: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882587.20304: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882587.20319: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882587.20336: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882587.20388: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882587.20404: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882587.20418: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882587.20435: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882587.20445: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882587.20458: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882587.20472: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882587.20484: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882587.20498: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882587.20512: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882587.20522: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882587.20534: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882587.20617: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882587.20633: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882587.20647: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882587.20847: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882587.22746: stdout chunk (state=3): >>>ansible-tmp-1726882587.1829252-21057-182443960117921=/root/.ansible/tmp/ansible-tmp-1726882587.1829252-21057-182443960117921 <<< 19110 1726882587.22943: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882587.22946: stdout chunk (state=3): >>><<< 19110 1726882587.22948: stderr chunk (state=3): >>><<< 19110 1726882587.23271: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882587.1829252-21057-182443960117921=/root/.ansible/tmp/ansible-tmp-1726882587.1829252-21057-182443960117921 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying 
existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 1726882587.23274: variable 'ansible_module_compression' from source: unknown 19110 1726882587.23277: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-191108pnkimox/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 19110 1726882587.23279: variable 'ansible_facts' from source: unknown 19110 1726882587.23281: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882587.1829252-21057-182443960117921/AnsiballZ_command.py 19110 1726882587.23841: Sending initial data 19110 1726882587.23844: Sent initial data (156 bytes) 19110 1726882587.26806: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882587.26821: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882587.26835: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882587.26858: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882587.26904: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882587.26916: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882587.26929: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882587.26949: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882587.26967: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882587.26981: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882587.26993: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882587.27005: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882587.27020: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882587.27091: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882587.27102: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882587.27114: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882587.27309: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882587.27326: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882587.27341: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882587.27528: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882587.29288: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 19110 1726882587.29389: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 19110 1726882587.29487: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-191108pnkimox/tmpqb2p2h1y /root/.ansible/tmp/ansible-tmp-1726882587.1829252-21057-182443960117921/AnsiballZ_command.py <<< 19110 1726882587.29579: 
stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 19110 1726882587.31090: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882587.31220: stderr chunk (state=3): >>><<< 19110 1726882587.31223: stdout chunk (state=3): >>><<< 19110 1726882587.31226: done transferring module to remote 19110 1726882587.31228: _low_level_execute_command(): starting 19110 1726882587.31230: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882587.1829252-21057-182443960117921/ /root/.ansible/tmp/ansible-tmp-1726882587.1829252-21057-182443960117921/AnsiballZ_command.py && sleep 0' 19110 1726882587.32602: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882587.32757: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882587.32776: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882587.32793: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882587.32836: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882587.32853: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882587.32875: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882587.32897: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882587.32912: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882587.32926: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882587.32942: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882587.32966: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 19110 1726882587.32988: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882587.33084: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882587.33096: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882587.33109: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882587.33195: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882587.33304: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882587.33318: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882587.33521: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882587.35372: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882587.35376: stdout chunk (state=3): >>><<< 19110 1726882587.35378: stderr chunk (state=3): >>><<< 19110 1726882587.35475: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 1726882587.35479: _low_level_execute_command(): starting 19110 1726882587.35482: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882587.1829252-21057-182443960117921/AnsiballZ_command.py && sleep 0' 19110 1726882587.36888: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882587.36892: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882587.36932: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found <<< 19110 1726882587.36936: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882587.36938: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found <<< 19110 1726882587.36941: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882587.37202: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882587.37206: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 
1726882587.37213: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882587.37316: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882587.51377: stdout chunk (state=3): >>> {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 02:9e:a1:0b:f9:6d brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.44.90/22 brd 10.31.47.255 scope global dynamic noprefixroute eth0\n valid_lft 2884sec preferred_lft 2884sec\n inet6 fe80::9e:a1ff:fe0b:f96d/64 scope link \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.44.1 dev eth0 proto dhcp src 10.31.44.90 metric 100 \n10.31.44.0/22 dev eth0 proto kernel scope link src 10.31.44.90 metric 100 \nIP -6 ROUTE\n::1 dev lo proto kernel metric 256 pref medium\nfe80::/64 dev eth0 proto kernel metric 256 pref medium\nRESOLV\n# Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-20 21:36:27.503789", "end": "2024-09-20 21:36:27.512081", "delta": "0:00:00.008292", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": 
true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 19110 1726882587.52594: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. <<< 19110 1726882587.52598: stdout chunk (state=3): >>><<< 19110 1726882587.52600: stderr chunk (state=3): >>><<< 19110 1726882587.52738: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 02:9e:a1:0b:f9:6d brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.44.90/22 brd 10.31.47.255 scope global dynamic noprefixroute eth0\n valid_lft 2884sec preferred_lft 2884sec\n inet6 fe80::9e:a1ff:fe0b:f96d/64 scope link \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.44.1 dev eth0 proto dhcp src 10.31.44.90 metric 100 \n10.31.44.0/22 dev eth0 proto kernel scope link src 10.31.44.90 metric 100 \nIP -6 ROUTE\n::1 dev lo proto kernel metric 256 pref medium\nfe80::/64 dev eth0 proto kernel metric 256 pref medium\nRESOLV\n# Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-20 21:36:27.503789", "end": "2024-09-20 21:36:27.512081", "delta": "0:00:00.008292", "msg": "", "invocation": {"module_args": {"_raw_params": 
"set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
19110 1726882587.52742: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882587.1829252-21057-182443960117921/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 19110 1726882587.52746: _low_level_execute_command(): starting 19110 1726882587.52748: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882587.1829252-21057-182443960117921/ > /dev/null 2>&1 && sleep 0' 19110 1726882587.54351: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882587.54367: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882587.54382: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882587.54398: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882587.54440: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882587.54452: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882587.54468: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882587.54485: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882587.54496: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882587.54505: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882587.54516: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882587.54527: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882587.54541: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882587.54552: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882587.54562: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882587.54577: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882587.54652: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882587.54677: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882587.54693: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882587.54817: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882587.56781: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882587.56823: stderr chunk (state=3): >>><<< 19110 1726882587.56827: stdout chunk (state=3): >>><<< 19110 1726882587.56971: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 1726882587.56982: handler run complete 19110 1726882587.56985: Evaluated conditional (False): False 19110 1726882587.56988: attempt loop complete, returning result 19110 1726882587.56990: _execute() done 19110 1726882587.56991: dumping result to json 19110 1726882587.56993: done dumping result, returning 19110 1726882587.56995: done running TaskExecutor() for managed_node1/TASK: Check routes and DNS [0e448fcc-3ce9-5372-c19a-00000000057e] 19110 1726882587.56997: sending task result for task 0e448fcc-3ce9-5372-c19a-00000000057e 19110 1726882587.57077: done sending task result for task 0e448fcc-3ce9-5372-c19a-00000000057e ok: [managed_node1] => { "changed": false, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "delta": "0:00:00.008292", "end": "2024-09-20 21:36:27.512081", "rc": 0, "start": "2024-09-20 21:36:27.503789" } STDOUT: IP 1: lo: mtu 65536 qdisc noqueue state UNKNOWN group 
default qlen 1000 link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00 inet 127.0.0.1/8 scope host lo valid_lft forever preferred_lft forever inet6 ::1/128 scope host valid_lft forever preferred_lft forever 2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000 link/ether 02:9e:a1:0b:f9:6d brd ff:ff:ff:ff:ff:ff altname enX0 inet 10.31.44.90/22 brd 10.31.47.255 scope global dynamic noprefixroute eth0 valid_lft 2884sec preferred_lft 2884sec inet6 fe80::9e:a1ff:fe0b:f96d/64 scope link valid_lft forever preferred_lft forever IP ROUTE default via 10.31.44.1 dev eth0 proto dhcp src 10.31.44.90 metric 100 10.31.44.0/22 dev eth0 proto kernel scope link src 10.31.44.90 metric 100 IP -6 ROUTE ::1 dev lo proto kernel metric 256 pref medium fe80::/64 dev eth0 proto kernel metric 256 pref medium RESOLV # Generated by NetworkManager search us-east-1.aws.redhat.com nameserver 10.29.169.13 nameserver 10.29.170.12 nameserver 10.2.32.1 19110 1726882587.57155: no more pending results, returning what we have 19110 1726882587.57159: results queue empty 19110 1726882587.57160: checking for any_errors_fatal 19110 1726882587.57162: done checking for any_errors_fatal 19110 1726882587.57163: checking for max_fail_percentage 19110 1726882587.57179: done checking for max_fail_percentage 19110 1726882587.57181: checking to see if all hosts have failed and the running result is not ok 19110 1726882587.57182: done checking to see if all hosts have failed 19110 1726882587.57182: getting the remaining hosts for this loop 19110 1726882587.57185: done getting the remaining hosts for this loop 19110 1726882587.57189: getting the next task for host managed_node1 19110 1726882587.57196: done getting next task for host managed_node1 19110 1726882587.57199: ^ task is: TASK: Verify DNS and network connectivity 19110 1726882587.57202: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, 
tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19110 1726882587.57206: getting variables 19110 1726882587.57208: in VariableManager get_vars() 19110 1726882587.57240: Calling all_inventory to load vars for managed_node1 19110 1726882587.57244: Calling groups_inventory to load vars for managed_node1 19110 1726882587.57248: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882587.57260: Calling all_plugins_play to load vars for managed_node1 19110 1726882587.57418: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882587.57425: Calling groups_plugins_play to load vars for managed_node1 19110 1726882587.57980: WORKER PROCESS EXITING 19110 1726882587.59945: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882587.61793: done with get_vars() 19110 1726882587.61824: done getting variables 19110 1726882587.61899: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Verify DNS and network connectivity] ************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24 Friday 20 September 2024 21:36:27 -0400 (0:00:00.490) 0:00:44.476 ****** 19110 1726882587.61932: entering _queue_task() for managed_node1/shell 19110 1726882587.62289: worker is 1 (out of 1 
available) 19110 1726882587.62301: exiting _queue_task() for managed_node1/shell 19110 1726882587.62313: done queuing things up, now waiting for results queue to drain 19110 1726882587.62315: waiting for pending results... 19110 1726882587.62616: running TaskExecutor() for managed_node1/TASK: Verify DNS and network connectivity 19110 1726882587.62753: in run() - task 0e448fcc-3ce9-5372-c19a-00000000057f 19110 1726882587.62781: variable 'ansible_search_path' from source: unknown 19110 1726882587.62788: variable 'ansible_search_path' from source: unknown 19110 1726882587.62831: calling self._execute() 19110 1726882587.62929: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882587.62940: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882587.62955: variable 'omit' from source: magic vars 19110 1726882587.63383: variable 'ansible_distribution_major_version' from source: facts 19110 1726882587.63401: Evaluated conditional (ansible_distribution_major_version != '6'): True 19110 1726882587.63574: variable 'ansible_facts' from source: unknown 19110 1726882587.64471: Evaluated conditional (ansible_facts["distribution"] == "CentOS"): True 19110 1726882587.64483: variable 'omit' from source: magic vars 19110 1726882587.64531: variable 'omit' from source: magic vars 19110 1726882587.64579: variable 'omit' from source: magic vars 19110 1726882587.64629: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19110 1726882587.64670: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19110 1726882587.64695: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19110 1726882587.64724: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19110 1726882587.64741: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19110 1726882587.64782: variable 'inventory_hostname' from source: host vars for 'managed_node1' 19110 1726882587.64797: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882587.64805: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882587.64926: Set connection var ansible_timeout to 10 19110 1726882587.64956: Set connection var ansible_module_compression to ZIP_DEFLATED 19110 1726882587.64970: Set connection var ansible_shell_executable to /bin/sh 19110 1726882587.64978: Set connection var ansible_shell_type to sh 19110 1726882587.64985: Set connection var ansible_connection to ssh 19110 1726882587.64994: Set connection var ansible_pipelining to False 19110 1726882587.65028: variable 'ansible_shell_executable' from source: unknown 19110 1726882587.65038: variable 'ansible_connection' from source: unknown 19110 1726882587.65052: variable 'ansible_module_compression' from source: unknown 19110 1726882587.65060: variable 'ansible_shell_type' from source: unknown 19110 1726882587.65071: variable 'ansible_shell_executable' from source: unknown 19110 1726882587.65078: variable 'ansible_host' from source: host vars for 'managed_node1' 19110 1726882587.65086: variable 'ansible_pipelining' from source: unknown 19110 1726882587.65094: variable 'ansible_timeout' from source: unknown 19110 1726882587.65103: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 19110 1726882587.65250: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 19110 1726882587.65275: variable 'omit' from source: magic vars 19110 1726882587.65285: starting attempt 
loop 19110 1726882587.65291: running the handler 19110 1726882587.65305: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 19110 1726882587.65327: _low_level_execute_command(): starting 19110 1726882587.65339: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 19110 1726882587.66999: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882587.67018: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882587.67035: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882587.67054: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882587.67098: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882587.67109: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882587.67126: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882587.67148: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882587.67159: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882587.67171: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882587.67182: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882587.67194: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882587.67208: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 
1726882587.67221: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882587.67234: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882587.67250: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882587.67326: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882587.67355: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882587.67374: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882587.67503: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882587.69204: stdout chunk (state=3): >>>/root <<< 19110 1726882587.69368: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882587.69371: stdout chunk (state=3): >>><<< 19110 1726882587.69383: stderr chunk (state=3): >>><<< 19110 1726882587.69490: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 1726882587.69499: _low_level_execute_command(): starting 19110 1726882587.69501: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882587.6940117-21074-71317466837618 `" && echo ansible-tmp-1726882587.6940117-21074-71317466837618="` echo /root/.ansible/tmp/ansible-tmp-1726882587.6940117-21074-71317466837618 `" ) && sleep 0' 19110 1726882587.70051: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882587.70069: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882587.70091: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882587.70124: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882587.70175: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882587.70189: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882587.70203: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882587.70220: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882587.70232: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882587.70243: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882587.70255: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882587.70288: stderr chunk (state=3): >>>debug1: Reading configuration 
data /etc/ssh/ssh_config <<< 19110 1726882587.70311: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882587.70324: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882587.70336: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882587.70350: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882587.70426: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882587.70450: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882587.70468: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882587.70601: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882587.72604: stdout chunk (state=3): >>>ansible-tmp-1726882587.6940117-21074-71317466837618=/root/.ansible/tmp/ansible-tmp-1726882587.6940117-21074-71317466837618 <<< 19110 1726882587.72790: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882587.72793: stdout chunk (state=3): >>><<< 19110 1726882587.72796: stderr chunk (state=3): >>><<< 19110 1726882587.72972: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882587.6940117-21074-71317466837618=/root/.ansible/tmp/ansible-tmp-1726882587.6940117-21074-71317466837618 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is 
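The tmpdir-creation command whose result appears above boils down to the following pattern: `umask 77` makes the directory mode 700, and the `KEY=VALUE` echo is what the controller parses back out of stdout to learn the remote path. This is a minimal sketch; the base path and name below are illustrative placeholders, not what Ansible actually generates.

```shell
# Private remote temp directory, following the pattern in the log above.
# umask 77 => new directories are created mode 700 (owner-only).
tmpbase="$HOME/.ansible/tmp"        # illustrative; the log uses /root/.ansible/tmp
name="ansible-tmp-demo-$$"          # illustrative; Ansible embeds a timestamp + random id
( umask 77 && mkdir -p "$tmpbase" && mkdir "$tmpbase/$name" \
  && echo "$name=$tmpbase/$name" ) && sleep 0
```

The subshell keeps the `umask` change from leaking into the caller's environment, which is why the log shows the whole thing wrapped in `( ... )`.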
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 1726882587.72976: variable 'ansible_module_compression' from source: unknown 19110 1726882587.72978: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-191108pnkimox/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 19110 1726882587.72980: variable 'ansible_facts' from source: unknown 19110 1726882587.73030: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882587.6940117-21074-71317466837618/AnsiballZ_command.py 19110 1726882587.73658: Sending initial data 19110 1726882587.73665: Sent initial data (155 bytes) 19110 1726882587.76671: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882587.77084: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882587.77102: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882587.77120: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882587.77165: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882587.77280: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882587.77295: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882587.77314: stderr 
chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882587.77327: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882587.77339: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882587.77350: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882587.77365: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882587.77382: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882587.77394: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882587.77407: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882587.77421: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882587.77726: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882587.77749: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882587.77770: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882587.77904: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882587.79768: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 <<< 19110 1726882587.79772: stderr chunk (state=3): >>>debug2: Server supports 
extension "expand-path@openssh.com" revision 1 <<< 19110 1726882587.79853: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 19110 1726882587.79961: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-191108pnkimox/tmp_nfwuk0t /root/.ansible/tmp/ansible-tmp-1726882587.6940117-21074-71317466837618/AnsiballZ_command.py <<< 19110 1726882587.80046: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 19110 1726882587.81503: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882587.81753: stderr chunk (state=3): >>><<< 19110 1726882587.81759: stdout chunk (state=3): >>><<< 19110 1726882587.81761: done transferring module to remote 19110 1726882587.81765: _low_level_execute_command(): starting 19110 1726882587.81767: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882587.6940117-21074-71317466837618/ /root/.ansible/tmp/ansible-tmp-1726882587.6940117-21074-71317466837618/AnsiballZ_command.py && sleep 0' 19110 1726882587.83460: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882587.83688: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882587.83705: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882587.83724: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882587.83824: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882587.83836: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882587.83849: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882587.83874: stderr chunk (state=3): 
>>>debug1: configuration requests final Match pass <<< 19110 1726882587.83890: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882587.83913: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882587.83927: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882587.83941: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882587.84016: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882587.84121: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882587.84137: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882587.84152: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882587.84348: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882587.84370: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882587.84386: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882587.84668: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882587.86479: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882587.86482: stdout chunk (state=3): >>><<< 19110 1726882587.86484: stderr chunk (state=3): >>><<< 19110 1726882587.86571: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 1726882587.86574: _low_level_execute_command(): starting 19110 1726882587.86576: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882587.6940117-21074-71317466837618/AnsiballZ_command.py && sleep 0' 19110 1726882587.87985: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19110 1726882587.88118: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882587.88134: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882587.88151: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882587.88196: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882587.88225: stderr chunk (state=3): >>>debug2: match not found <<< 19110 1726882587.88239: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882587.88289: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19110 1726882587.88302: stderr chunk (state=3): >>>debug2: 
resolve_canonicalize: hostname 10.31.44.90 is address <<< 19110 1726882587.88313: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19110 1726882587.88332: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882587.88346: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882587.88368: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882587.88381: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 19110 1726882587.88393: stderr chunk (state=3): >>>debug2: match found <<< 19110 1726882587.88407: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882587.88522: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882587.88669: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882587.88686: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882587.88889: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882588.18451: stdout chunk (state=3): >>> {"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org 
mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 11730 0 --:--:-- --:--:-- --:--:-- 12200\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 2508 0 --:--:-- --:--:-- --:--:-- 2508", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-20 21:36:28.018737", "end": "2024-09-20 21:36:28.182850", "delta": "0:00:00.164113", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! 
curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 19110 1726882588.19862: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. <<< 19110 1726882588.20540: stderr chunk (state=3): >>><<< 19110 1726882588.20544: stdout chunk (state=3): >>><<< 19110 1726882588.20708: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left 
Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 11730 0 --:--:-- --:--:-- --:--:-- 12200\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 2508 0 --:--:-- --:--:-- --:--:-- 2508", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-20 21:36:28.018737", "end": "2024-09-20 21:36:28.182850", "delta": "0:00:00.164113", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! 
curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 19110 1726882588.20712: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts "$host"; then\n echo FAILED to lookup host "$host"\n exit 1\n fi\n if ! 
curl -o /dev/null https://"$host"; then\n echo FAILED to contact host "$host"\n exit 1\n fi\ndone\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882587.6940117-21074-71317466837618/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 19110 1726882588.20715: _low_level_execute_command(): starting 19110 1726882588.20718: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882587.6940117-21074-71317466837618/ > /dev/null 2>&1 && sleep 0' 19110 1726882588.22047: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 19110 1726882588.22051: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19110 1726882588.22085: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found <<< 19110 1726882588.22089: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19110 1726882588.22092: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19110 1726882588.22269: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19110 1726882588.22343: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19110 1726882588.22351: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19110 1726882588.22452: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19110 1726882588.24272: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19110 1726882588.24344: stderr chunk (state=3): >>><<< 19110 1726882588.24347: stdout chunk (state=3): >>><<< 19110 1726882588.24369: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19110 1726882588.24571: handler run complete 19110 1726882588.24573: Evaluated conditional (False): False 19110 1726882588.24576: attempt loop complete, returning result 19110 1726882588.24578: _execute() done 19110 1726882588.24580: dumping result to json 19110 1726882588.24582: done dumping result, returning 19110 1726882588.24584: done running TaskExecutor() for managed_node1/TASK: Verify DNS and network connectivity [0e448fcc-3ce9-5372-c19a-00000000057f] 19110 1726882588.24586: sending task result for task 0e448fcc-3ce9-5372-c19a-00000000057f 19110 1726882588.24660: done sending task result for task 0e448fcc-3ce9-5372-c19a-00000000057f 19110 1726882588.24666: WORKER PROCESS EXITING
ok: [managed_node1] => {
    "changed": false,
    "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n",
    "delta": "0:00:00.164113",
    "end": "2024-09-20 21:36:28.182850",
    "rc": 0,
    "start": "2024-09-20 21:36:28.018737"
}

STDOUT:

CHECK DNS AND CONNECTIVITY
2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org
2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org
2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org
2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org
2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org
2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org
2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org
2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org
2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org
2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org
2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org
2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org

STDERR:

% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0
100 305 100 305 0 0 11730 0 --:--:-- --:--:-- --:--:-- 12200
% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0
100 291 100 291 0 0 2508 0 --:--:-- --:--:-- --:--:-- 2508
19110 1726882588.24744: no more pending results, returning what we have 19110 1726882588.24747: results queue empty 19110 1726882588.24749:
checking for any_errors_fatal 19110 1726882588.24761: done checking for any_errors_fatal 19110 1726882588.24762: checking for max_fail_percentage 19110 1726882588.24765: done checking for max_fail_percentage 19110 1726882588.24766: checking to see if all hosts have failed and the running result is not ok 19110 1726882588.24767: done checking to see if all hosts have failed 19110 1726882588.24768: getting the remaining hosts for this loop 19110 1726882588.24771: done getting the remaining hosts for this loop 19110 1726882588.24775: getting the next task for host managed_node1 19110 1726882588.24784: done getting next task for host managed_node1 19110 1726882588.24786: ^ task is: TASK: meta (flush_handlers) 19110 1726882588.24788: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19110 1726882588.24792: getting variables 19110 1726882588.24793: in VariableManager get_vars() 19110 1726882588.24823: Calling all_inventory to load vars for managed_node1 19110 1726882588.24826: Calling groups_inventory to load vars for managed_node1 19110 1726882588.24830: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882588.24841: Calling all_plugins_play to load vars for managed_node1 19110 1726882588.24844: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882588.24847: Calling groups_plugins_play to load vars for managed_node1 19110 1726882588.27322: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882588.29293: done with get_vars() 19110 1726882588.29324: done getting variables 19110 1726882588.29416: in VariableManager get_vars() 19110 1726882588.29427: Calling all_inventory to load vars for managed_node1 19110 1726882588.29429: Calling groups_inventory to load vars for managed_node1 19110 1726882588.29431: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882588.29436: Calling all_plugins_play to load vars for managed_node1 19110 1726882588.29438: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882588.29449: Calling groups_plugins_play to load vars for managed_node1 19110 1726882588.30971: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882588.35658: done with get_vars() 19110 1726882588.35696: done queuing things up, now waiting for results queue to drain 19110 1726882588.35698: results queue empty 19110 1726882588.35699: checking for any_errors_fatal 19110 1726882588.35702: done checking for any_errors_fatal 19110 1726882588.35703: checking for max_fail_percentage 19110 1726882588.35704: done checking for max_fail_percentage 19110 1726882588.35705: checking to see if all hosts have failed and the running result is not 
ok 19110 1726882588.35706: done checking to see if all hosts have failed 19110 1726882588.35707: getting the remaining hosts for this loop 19110 1726882588.35707: done getting the remaining hosts for this loop 19110 1726882588.35710: getting the next task for host managed_node1 19110 1726882588.35714: done getting next task for host managed_node1 19110 1726882588.35715: ^ task is: TASK: meta (flush_handlers) 19110 1726882588.35717: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19110 1726882588.35719: getting variables 19110 1726882588.35720: in VariableManager get_vars() 19110 1726882588.35728: Calling all_inventory to load vars for managed_node1 19110 1726882588.35731: Calling groups_inventory to load vars for managed_node1 19110 1726882588.35733: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882588.35738: Calling all_plugins_play to load vars for managed_node1 19110 1726882588.35740: Calling groups_plugins_inventory to load vars for managed_node1 19110 1726882588.35743: Calling groups_plugins_play to load vars for managed_node1 19110 1726882588.36949: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882588.39377: done with get_vars() 19110 1726882588.39402: done getting variables 19110 1726882588.39454: in VariableManager get_vars() 19110 1726882588.39537: Calling all_inventory to load vars for managed_node1 19110 1726882588.39540: Calling groups_inventory to load vars for managed_node1 19110 1726882588.39552: Calling all_plugins_inventory to load vars for managed_node1 19110 1726882588.39561: Calling all_plugins_play to load vars for managed_node1 19110 1726882588.39581: Calling groups_plugins_inventory to load vars for 
managed_node1 19110 1726882588.39589: Calling groups_plugins_play to load vars for managed_node1 19110 1726882588.40973: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19110 1726882588.43188: done with get_vars() 19110 1726882588.43213: done queuing things up, now waiting for results queue to drain 19110 1726882588.43215: results queue empty 19110 1726882588.43216: checking for any_errors_fatal 19110 1726882588.43217: done checking for any_errors_fatal 19110 1726882588.43218: checking for max_fail_percentage 19110 1726882588.43219: done checking for max_fail_percentage 19110 1726882588.43220: checking to see if all hosts have failed and the running result is not ok 19110 1726882588.43221: done checking to see if all hosts have failed 19110 1726882588.43221: getting the remaining hosts for this loop 19110 1726882588.43222: done getting the remaining hosts for this loop 19110 1726882588.43225: getting the next task for host managed_node1 19110 1726882588.43228: done getting next task for host managed_node1 19110 1726882588.43229: ^ task is: None 19110 1726882588.43231: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19110 1726882588.43232: done queuing things up, now waiting for results queue to drain 19110 1726882588.43233: results queue empty 19110 1726882588.43233: checking for any_errors_fatal 19110 1726882588.43234: done checking for any_errors_fatal 19110 1726882588.43235: checking for max_fail_percentage 19110 1726882588.43236: done checking for max_fail_percentage 19110 1726882588.43236: checking to see if all hosts have failed and the running result is not ok 19110 1726882588.43237: done checking to see if all hosts have failed 19110 1726882588.43238: getting the next task for host managed_node1 19110 1726882588.43240: done getting next task for host managed_node1 19110 1726882588.43241: ^ task is: None 19110 1726882588.43242: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False

PLAY RECAP *********************************************************************
managed_node1 : ok=83 changed=3 unreachable=0 failed=0 skipped=73 rescued=0 ignored=1

Friday 20 September 2024 21:36:28 -0400 (0:00:00.814) 0:00:45.291 ******
===============================================================================
fedora.linux_system_roles.network : Check which packages are installed --- 2.00s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
fedora.linux_system_roles.network : Check which services are running ---- 1.97s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.72s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.59s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
Install iproute --------------------------------------------------------- 1.53s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16
Gathering Facts --------------------------------------------------------- 1.29s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tests_ethernet_nm.yml:6
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 1.24s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
Create veth interface lsr27 --------------------------------------------- 1.09s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27
Gathering Facts --------------------------------------------------------- 1.09s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile.yml:3
fedora.linux_system_roles.network : Check which packages are installed --- 1.04s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
fedora.linux_system_roles.network : Configure networking connection profiles --- 0.98s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
Gathering Facts --------------------------------------------------------- 0.97s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/remove_profile.yml:3
Gathering Facts --------------------------------------------------------- 0.97s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:68
Gathering Facts --------------------------------------------------------- 0.96s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile+delete_interface.yml:5
fedora.linux_system_roles.network : Check which packages are installed --- 0.90s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
Gathering Facts --------------------------------------------------------- 0.90s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:33
Gathering Facts --------------------------------------------------------- 0.89s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:77
Gathering Facts --------------------------------------------------------- 0.89s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:50
Verify DNS and network connectivity ------------------------------------- 0.81s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24
fedora.linux_system_roles.network : Configure networking connection profiles --- 0.81s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
19110 1726882588.43452: RUNNING CLEANUP
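For reference, the check that the "Verify DNS and network connectivity" task runs (it appears above only in escaped form inside the module arguments) can be read as a standalone script. The sketch below wraps the per-host logic in a helper; the `check_host` function name is an addition for readability and is not part of the original task:

```shell
#!/usr/bin/env bash
# Same safety flags the logged task uses: exit on error, on unset
# variables, and on any failure inside a pipeline.
set -euo pipefail

# Resolve a host through NSS (getent), then probe it over HTTPS with
# curl, discarding the response body. This mirrors the loop body of the
# logged task; the check_host wrapper itself is illustrative only.
check_host() {
    local host=$1
    if ! getent hosts "$host"; then
        echo "FAILED to lookup host $host"
        return 1
    fi
    if ! curl -o /dev/null "https://$host"; then
        echo "FAILED to contact host $host"
        return 1
    fi
}
```

The original task announces `CHECK DNS AND CONNECTIVITY` and then runs this logic in a loop over `mirrors.fedoraproject.org` and `mirrors.centos.org`; any lookup or connect failure exits non-zero, which is what would make the Ansible task fail instead of reporting `ok`.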